December 2025 Archives - First Things
Published by The Institute of Religion and Public Life, First Things is an educational institute aiming to advance a religiously informed public philosophy.

The Death of Daniel Kahneman
https://firstthings.com/the-death-of-daniel-kahneman/
Wed, 03 Dec 2025

The post The Death of Daniel Kahneman appeared first on First Things.

Daniel Kahneman was a Nobel laureate in economics, the author of the international bestseller Thinking, Fast and Slow, and a giant in the study of decision-making and behavioral economics. On March 27, 2024, he died. Not until a year later did it become known that he had taken his own life.

The revelation has received relatively little attention. It was made by Jason Zweig in an article for the Wall Street Journal on March 14, 2025. A month later, Katarzyna de Lazari-Radek and Peter Singer offered their views in an essay for the New York Times. Little else has been written, and nothing of any length. Kahneman made clear that he did not intend his death as a public act or statement, and yet it raises questions. When an expert on judgment and decision-making decides to take his own life, we can’t help asking whether his final act confirmed his reputation or undermined it. We wonder how we should approach the last years of our lives.

The reasons for Kahneman’s decision are not clear, and caution is warranted. But the articles by Zweig and de Lazari-Radek and Singer offer glimpses of his reasoning. Just before his death, Kahneman contacted several friends to inform them of his decision and say goodbye. In these messages, Kahneman explained that he was acting on his belief that “the miseries and indignities of the last years of life are superfluous.” He confirmed that he was not suffering from any condition that caused pain or disability. He was active, still capable of research and writing and of enjoying many things in life. Just before traveling to an assisted-suicide clinic in Switzerland, he spent several days in Paris, according to Zweig, “walking around the city, going to museums and the ballet, and savoring soufflés and chocolate mousse.” Nonetheless, Kahneman was convinced that his kidneys were “on their last legs” and that “the frequency of [his] mental lapses” was increasing. He was ninety years old. “It is time to go,” he concluded.

It is hard not to view Kahneman’s decision through the lens of his work. In Thinking, Fast and Slow, he considered the way in which the last years of a person’s life might govern the evaluation of that life. His conclusion was that, when we “intuitively” assess a life, its duration means little. Most important are the peaks and ends—in other words, the most intense high points and the manner of death. To describe this phenomenon, he coined the “peak-end rule.” Kahneman argued that this rule creates distortions and prevents us from thinking clearly, logically, and well. But it is not clear that he thought we could or should do much about it.

Kahneman cited many experiments in demonstration of the peak-end rule. One involved painful colonoscopies, another holding one’s hand in very cold water. But one experiment explicitly concerned the evaluation of a person’s life. A psychologist and his students developed two versions of the biography of a fictitious woman, “Jen,” who never married and had no children and “died instantly and painlessly in an automobile accident.” In the first version, she was “extremely happy throughout her life (which lasted either thirty or sixty years), enjoying her work, taking vacations, and spending time with her friends and on her hobbies.” In the second version, she lived an additional five years, dying when she was thirty-five or sixty-five. The extra years were described as pleasant, but less so than the earlier thirty or sixty years. Each participant in the study was asked to consider the desirability of a version of Jen’s life and the total happiness she experienced.

According to Kahneman, the results demonstrated that the length of a life meant little and the quality of the last years meant a great deal to the assessment of that life. Even the doubling of the length of Jen’s life had no effect on assessments of its desirability. More strikingly, the addition of five “slightly happy” years to an otherwise extremely happy life “caused a substantial drop in evaluations of the total happiness of that life.” These results confounded Kahneman. He suggested tweaks to the experiment, but these only confirmed that “what truly matters when we intuitively assess” a life—or shorter events, such as a vacation or childbirth—“is the progressive deterioration or improvement of the ongoing experience, and how the person feels at the end.”

To help him understand this result, Kahneman distinguished two types of utility, corresponding to two selves. “Experienced utility” was the amount of pleasure (or pain) a person experienced at each moment over a period of time. In contrast, “decision utility” reflected an assessment of the pleasurableness (or painfulness) of the episode as a whole. The “experiencing self” knows the experienced utility of an episode. It can answer the question, “Does it hurt now?” The “remembering self” assesses decision utility. It can answer the question, “How was it on the whole?”

When participants in the study ignored the length of Jen’s life and focused on its last five years, their remembering selves made judgments on the basis of decision utility. They assessed her life as a whole and did not think about the sum of the experienced utility of each moment her experiencing self would have known. Kahneman likened the remembering self’s indifference to time, its emphasis on peak events and endings, to storytelling. “In storytelling mode,” he observed, “an episode is represented by a few critical moments, especially the beginning, the peak, and the end. Duration is neglected.” The remembering self tells the story of our lives.

It is tempting to believe that, when Kahneman decided to end his life, he relied on the perceptions of his remembering self. From this perspective, his life would be no better if he lived an additional five or more years. In fact, it would be judged worse, because it would be marred by “the miseries and indignities” of his last years. By avoiding these last years, he would give the story of his life a more pleasant ending, and it would be judged better on the whole. But if this was his view, it would have been met with several objections.

Not the least of these was Kahneman’s insistence that the storytelling of the remembering self was wrong. By the standards of rational decision-making, to ignore duration and emphasize peak events and endings was irrational. The sum of Jen’s experienced utility had to be greater if she lived five years longer. Nor was it reasonable to evaluate an entire life by its last years. Kahneman called both these assessments “indefensible.” The “logical” approach was to understand a life as “a series of moments, each with a value,” and the value of a life as “the sum of the values of its moments.” “The remembering self’s neglect of duration, its exaggerated emphasis on peaks and ends, and its susceptibility to hindsight combine to yield distorted reflections of our actual experience.”

And yet, in his own life, he did not dismiss entirely the storytelling of the remembering self. He regarded this contradiction as an artifact of his humanity. Episodes of his life were shaped by recollections of peak events and their endings; they were largely unaffected by their actual duration. Kahneman was even known on occasion to end his vacations a day or two early to ensure that they produced good memories. “I am my remembering self,” he wrote; “the experiencing self, who does my living, is like a stranger to me.” Still, it seems unlikely that he would make a truly momentous decision on the basis of a judgment he considered inaccurate. To end one’s vacation on the basis of the distorted judgment of the remembering self is one thing; to end one’s life on that basis is another.

Another problem with attributing Kahneman’s decision to the judgment of the remembering self is that his remembering self was not going to be around to judge. Unless Kahneman believed in an afterlife, he could not expect to remember his life after it had ended. His remembering self would die with him. Of course, the experiment involving evaluations of Jen’s alternate lives suggests that the work of the “remembering self” is not always a work of memory. The participants did not remember and evaluate their own lives; they imagined and evaluated Jen’s. The label “remembering self” is therefore slightly misleading. The self who ignores duration and focuses on peak experiences and endings is not always engaged in remembering. Rather, it is evaluating an experience—its own or someone else’s. The remembering self must still, however, have a place to stand, a perspective from which to evaluate. In the case of Kahneman’s evaluation of his own life, this perspective is paradoxical. His life was not over; the end was not known. It seems he was imagining his life from the perspective of someone who survived him, someone who already knew the ending.

Comments from de Lazari-Radek and Singer tend to confirm this. Kahneman, while still alive, judged that his life was “complete.” “Kahneman thought that he had completed his life,” wrote de Lazari-Radek and Singer, presumably on the basis of their conversations with him. This is a perplexing statement. Can a life be judged complete before it is over? Kahneman seemed to think so. In his messages to friends, he suggests that anything further he could do or experience would be “superfluous.”

This judgment is especially perplexing given that he purported to believe that his life was meaningless. In the interview with de Lazari-Radek and Singer, Kahneman denied that his work had any objective significance: “Other people happen to respect it and say that this is for the benefit of humanity,” but they were mistaken. “I just like to get up in the morning because I like the work.” When de Lazari-Radek and Singer argued that his work was important, he disagreed: “If there is an objective point of view, then I’m totally irrelevant to it. If you look at the universe and the complexity of the universe, what I do with my day cannot be relevant.”

If Kahneman’s life was meaningless, how could it be complete? Completion assumes a whole: a story with a beginning, middle, and end; a chord, the resolving note of which has been sounded; a picture in which all is in its place and nothing is missing. A meaningless life, a life without significance, can never be complete because it is not whole. Yet Kahneman believed his life was somehow both meaningless and complete. Of course, he made no pretense to objective judgment. His sense of completion was simply “a feeling.” “I feel I’ve lived my life well,” he said, “but it’s a feeling. I’m just reasonably happy with what I’ve done.”

De Lazari-Radek and Singer did not accept Kahneman’s assessment of his work or his life. They thought his work was valuable: “We do not agree that the size and complexity of the universe render irrelevant an individual’s work for the benefit of humanity.” And they thought Kahneman still had more work to do; he “could still enlighten audiences on how to make better decisions.” Nonetheless, they respected Kahneman’s decision to end his life: “If, after careful reflection, you decide that your life is complete and remain firmly of that view for some time, you are the best judge of what is good for you.” They added that a judgment that a person’s life was complete carried special weight “in the case of people who are at an age at which they cannot hope for improvement in their quality of life.”

But Kahneman’s staking his life on the “feeling” that it was complete remains extraordinary, given that he was a behavioral economist, much of whose work consisted of showing us that our feelings are often mistaken and distorted. Zweig suggested that Kahneman’s decision to end his life was unrelated to the principles of decision-making that he promoted in his work. It was motivated “above all” by a desire “to avoid a long decline, to go out on his terms, to own his own death.” Zweig noted that Kahneman was deeply troubled by the death of his wife in 2018, after years of dementia. His mother likewise lost much of her memory before she died. Zweig surmised that Kahneman did not want the same to happen to him. When in his last message to his friends Kahneman stated that “the miseries and indignities of the last years of life are superfluous,” this meant, Zweig thought, that Kahneman believed he faced the same fate.

In short, Kahneman was scared. He was afraid that his cognitive abilities would decline along with his body. It is an understandable fear. For most of us, the ability to move our bodies easily, to hear, see, and think clearly, to live without depending on others—these seem like the minimum requirements for living well. The prospect of losing them is frightening.

We all experience fear. Some of us change our behavior as a result: We stop riding our bikes on crowded streets, pass on that trip to the Himalayas, or avoid having children. But sometimes we don’t change our behavior, even though we are afraid. In these cases, we judge the good to be attained as worth the risk.

The word for willingness to withstand fear in pursuit of a good is “fortitude” or, more commonly, “bravery.” At first glance, to suggest that Kahneman lacked bravery seems silly. This was a man who calmly and methodically faced what many would consider the ultimate and deepest loss, death. Traditionally, a willingness to die on the battlefield or in other difficult situations has been the mark of bravery. But in one of his essays on fortitude, Josef Pieper quotes Thomas Aquinas to remind us that it is not the risking of death that matters, but the realization of the good: “To take death upon oneself is not in itself praiseworthy, but solely because of its subordination to good.” Real bravery requires a correct evaluation of things, of the risks as well as what one hopes to preserve or gain. Kahneman’s willingness to face death was brave only if it was for the sake of something good—good enough to warrant ending his life.

What was the good that Kahneman hoped to gain? Fear points to what we value. Fear arises from the perception that we are in danger of losing something we believe is good. In Kahneman’s case, it appears he feared the deterioration of his mind and body. One might say that he wished to preserve his physical and mental health. But if this were the case, ending his life was not a good solution. A dead man has certainly not succeeded in preserving his health. A more exact description of the good he pursued might be: He wanted to preserve an image of himself, free from the deterioration associated with aging. He wanted to see himself, and to be remembered by others, as he was in his vital years.

In a notorious article for the Atlantic in 2014, the bioethicist Ezekiel Emanuel articulated his desire to live no more than seventy-five years. “How do we want to be remembered by our children and grandchildren?” he asked. “We wish our children to remember us in our prime. Active, vigorous, engaged, animated, astute, enthusiastic, funny, warm, loving. Not stooped and sluggish, forgetful and repetitive, constantly asking, ‘What did she say?’ We want to be remembered as independent, not experienced as burdens.” He admitted that “with effort our children will be able to recall” the good moments. But if we live much past seventy-five, he contended, the later years—the years of disabilities and caregiving arrangements—will inevitably become the salient memories. He concluded, “Leaving them [our children]—and our grandchildren—with memories framed not by our vivacity but by our frailty is the ultimate tragedy.”

The desire to be remembered as healthy and vital is only natural. We regard blossoming flowers as more beautiful than wilted ones. A prowling tiger wins our admiration more than a wounded one. And a new, gleaming building inspires our awe more than one that totters in disrepair. In each case, the stage in which an object is regarded as at its most vital defines what we consider the object to be. A flower is known by its blossoming, the tiger by its prowling, the skyscraper by its proud piercing of the sky. We can still recognize these objects when they are not at their most vigorous, but their lives take their meaning and significance from the stage in which they most fully realized their purposes.

Still, this focus on our vital years is not satisfactory. The vital years are attractive, but they are not a totality. A flower that does not wilt is artificial. A tiger that cannot be wounded is a stuffed animal, and a building that is impervious to gravity is a fantasy. A man whose body never deteriorates is not human. To end one’s life for the sake of perpetuating a memory of oneself as vigorous and successful is a sacrifice of the real to the fake. It is not bravery.

Underlying the desire to avoid the later years of life is a failure to distinguish between living and the image of living. Preserving one’s image as immune to illness and decay might be worth ending one’s life if the image were thought commensurable with those additional years of life—if the fact of existence were essentially the same as an image of existence. In that case, a few years of life could be deemed superfluous, an unnecessary epilogue. That the image is only a story, and those extra years are years of actual existence, wouldn’t matter. Life and the image of life would be equivalent.

Kahneman’s decision to die at ninety and Emanuel’s desire not to live beyond seventy-five suggest that this equivalence has some salience. For those of this mindset, the qualitative distinction between a life and the image of a life tends to dissolve. The experience of living has no greater reality, no firmer foundation, than an image of that experience. The two are commensurable. Both can be placed on a single continuum of pleasure and pain.

In The Brothers Karamazov, Elder Zosima speaks of our connection with “mysterious worlds,” in which are buried the roots of our thoughts and feelings. These mysterious worlds are concealed from us, and yet we have been granted a sense of our living bond with them. Indifference arises when this bond dies. We might fight off indifference with distractions and passing pleasures, but it is a losing battle, especially as we age and the pleasures lose their appeal.

For a social scientist like Kahneman, the risk of losing touch with these mysterious worlds would seem acute. Much of Kahneman’s work was an attempt to describe the ways in which humans were likely to behave irrationally. This work required Kahneman to keep an eye on what was rational, according to a scientific understanding that depended on objective observation and measurement. Such a focus leaves no room for mysterious worlds.

The exclusion of mystery can be seen in Kahneman’s consideration of the study of well-being. As he became familiar with existing work in this area, he realized that almost every study depended on responses to survey questions that measured the remembering self’s assessment of well-being, not that of the experiencing self. Since he was already convinced that the assessments of the remembering self were not reliable, he sought ways to measure well-being from the perspective of the experiencing self. His solution was to assume that every moment experienced by the experiencing self and every episode known to the remembering self could be understood in terms of utility—that is, as either pleasurable or painful.

Kahneman was not naive about this categorization. He conceded in Thinking, Fast and Slow that “the experience of a moment or an episode is not easily represented by a single happiness value.” There were two obvious complications. First, feelings come in many forms. “Positive feelings” include emotions as varied as “love, joy, engagement, hope, [and] amusement.” (Likewise, “negative emotions” could include emotions as different as “anger, shame, depression, and loneliness.”) Second, positive and negative emotions may exist at the same time. An event might be experienced as amusing and shameful. Still, Kahneman insisted that it was “possible to classify most moments of life as ultimately positive or negative.”

This perspective flattens the world. Falling in love and eating ice cream differ not in the kind but only in the degree of pleasure they bestow. The death of a daughter and the traffic jam that makes us late for work differ only in the degree of pain they cause. A hug from a friend and a bite of cake are more or less the same thing. Some events may be more positive or negative than others, but in the end, all events are commensurable. All are on the continuum of utility, which cannot comprehend mystery.

Perhaps Kahneman’s utilitarian calculus was merely a method of simplifying complex issues for the sake of his experiments. But it would be difficult to leave this viewpoint at the office. Anyone who thinks in this way regularly for his work would be likely to acquire the habit in other areas of his life. For Kahneman, all was either pleasurable or painful. All that life offered fit neatly, without remainder, within the parameters of his experiments. The possibility that he might learn something new, something significant enough to change his experience of living, was evidently unthinkable.

Aging has a way of tightening our focus on matters that escaped our attention earlier. By the time we reach our seventies, the urge to make our mark on the world—through amassing wealth or power or praise—may still exist, but it will have faded. In this way, aging can clear away distractions that hid things that were there all along. What seemed a loss is a gift, for it calls attention to the fullness of existence itself.

But not every loss incurred during aging will cause us to recognize the fullness of being. Most will be experienced as losses, nothing more. Much therefore depends on whether we have an expectation of something beyond the loss, however mysterious that something may be. Entertaining this expectation is a habit of thought that determines whether we experience life as a dead end or as an adventure in understanding what it means to live well.

One might expect Kahneman to have developed this habit of thought. But we know that he regarded his work not as a pursuit of the truth about living well, which would redound to “the benefit of humanity,” but rather as a private amusement, like a crossword puzzle.

He may have had a point there. Many of the experiments on which Kahneman’s work relies seem too contrived to reveal much that is profound or helpful to living. One wonders what we can really learn about well-being or human flourishing from our reactions to painful colonoscopies or the amount of time we hold our hands in cold water. Likewise with assessments of the life of a woman who might live five more “slightly happy” years. The experiments are clever, but they are detached from the concrete experience of living. Perhaps after a lifetime of developing experiments of this kind and being handsomely rewarded for it, Kahneman sensed their hollowness.

Was Kahneman’s decision the right one? None of us can know what he faced or the reasons for his actions. Still, we can hope that when our time comes, we will have the courage to withstand the pain and indignities of aging. Elder Zosima says of those who take their own lives, “There can be no one unhappier than they.” Even though he knew the Church considered suicide a grave sin, he confessed that he prayed for them and thought “in the secret of [his] soul” that he was permitted to do so, for “Christ will not be angered by love.” Perhaps this is what we owe Daniel Kahneman: not our condemnation, but our prayers.


Image by nrkbeta, licensed via Creative Commons. Image cropped. 

Christian Ownership Maximalism
https://firstthings.com/christian-ownership-maximalism/
Tue, 02 Dec 2025

The post Christian Ownership Maximalism appeared first on First Things.

Christendom is gone. So, too, is much of the Western civilization that was built atop it. Christians find themselves strangers and sojourners in an unfamiliar land. Aaron Renn calls this landscape the “Negative World.” In the Negative World, it is socially and politically harmful—and economically dangerous—to be publicly Christian. What should Christians do in this Negative World? How should we live in the face of the disintegration of Christendom and the civilization Christendom built?

One answer, put forward by James Shea, is to reorient Christian structures away from maintenance and toward mission. Every domain of Catholic life—schools, seminaries, lay movements, parishes, churches—should be oriented to evangelical, or missionary, activity. Shea is undoubtedly right about this. In an apostolic age, our job is to be apostles. The mission is mission.

Shea’s reframing of the Great Commission has been taken up by many Catholics in the United States. But Shea’s purpose was to offer a framework for thinking about our historical moment. His framework is not a playbook, any more than is Renn’s Negative World.

A playbook has been offered by Protestant pastors and theologians, who have long espoused the idea that Christians should expand their ownership and control of physical space, real estate, and businesses in order to build foundations upon which Christian communities can persist. This is a version of Rod Dreher’s Benedict Option, but one that does not withdraw from the world. It emphasizes the importance of asset ownership for survival in a hostile landscape. I call this approach “Christian Ownership Maximalism.”

Christian Ownership Maximalism urges us to increase our ownership of the economy’s productive assets for the explicit purpose of advancing the Kingdom of God. Christian Ownership Maximalism is, in my view, implied by the Christian understanding of ownership, or property. The relation can be explained in the following way.

First, ownership is authority. It is the legitimate exercise of power over artifacts and created things. This fact is well understood and uncontroversial.

Second, some kinds of ownership are better and more important than others. Owning the means of production matters more than owning consumption goods. It may be the case that, in theory, an owner of $1 million worth of consumption goods can trade those consumption goods for productive assets (for example, a company or real estate) that are likewise worth $1 million. But in practice, the owner of the consumption goods is dependent in a way the producer is not. The heart of the difference is not the distinction between “consumers and producers,” but that between dependent people and independent people.

This is what was meant by economists of decades past when they said that “economic power” rests in ownership of the means of production. A person who owns the means of production is not only independent in a way the equally wealthy consumer is not, but can exert power by withholding or withdrawing his output. These facts were obvious to most economic thinkers until recently.

Third, Christian and non-Christian ownership are different things. This is so because Christian authority is different from non-Christian authority. End, or telos, determines nature, and Christian philosophy recognizes a purpose for ownership that the post-Enlightenment understanding does not. The Christian account of property bounds its use and directs it to ends that foster human flourishing. In the Christian understanding of ownership, property is authority over a created thing, granted by the natural law, as a participation in the authority of God. The non-Christian understanding is very different. It is the liberal account that emerged with social contract theory, and it says that ownership is sovereignty, or complete power of disposition, mediated or mitigated by the social contract (the state). Property is not considered part of the natural law—in other words, ownership is not natural to man. Property is a creature of, and legitimized by, the social contract and nothing more. As such, the legitimacy of ownership is divorced from participation in God’s providential will.

The Christian account says, “I cannot do whatever I want with my property. My agency must conform to, and participate in, the will of God.” The post-Christian account says, “I can do whatever I want with the things I own, so long as I am not hurting anyone else or breaking the law.” The liberal conception of property is, “Do your will, subject to what the law says.” It may not be an exaggeration to say that the liberal conception of property is illegitimate authority masked as legitimate—that is, power masked as authority.

The Christian account has scholastic and Aristotelian roots, but its taproot is Jesus Christ. The Gospels are full of Christ’s teaching on property. Jesus’s teachings consistently emphasize the limits of personal ownership under the sovereignty of God, reminding believers that their possessions are entrusted to them by a higher authority. In the parable of the rich fool, Jesus warns: “But God said to him, ‘You fool! This very night your life will be demanded from you. Then who will get what you have prepared for yourself?’” He likewise insists: “No one can serve two masters. You cannot serve both God and mammon.” He commends the vigilant steward who uses the goods entrusted to him for the Master’s purposes. The maxim “Where your treasure is, there your heart will be also” underscores that Christian ownership confers not unbridled license, but an opportunity to participate in the providential will of God.

Because Christian ownership is a true account of ownership, and the Christian God is the true God, it follows that Christians should expand their “ownership share” of productive assets—not for the sake of maximizing Christian wealth, but for the sake of turning economic power into legitimate economic authority that advances Christ’s design for history. This is Christian Ownership Maximalism.

Writers such as Renn, Doug Wilson, and Jeff Durbin emphasize the importance of Christians’ actively securing and stewarding productive assets. They view such ownership not as a worldly ambition but as a theological and strategic necessity. They share the conviction that when Christians own and manage the means of production—whether business, land, or other resources—they are better able to shape culture according to godly principles. R. J. Rushdoony and Gary North, with their more systematic advocacy of Christian Reconstruction, contend that biblical governance applies to every aspect of human life, including economic structures. By urging believers to cultivate entrepreneurial ventures, invest in property, and create sustainable business models, these thinkers argue that Christians can fulfill God’s mandate to exercise dominion in ways that transform societies from the ground up.

This movement is continuous with an older Calvinist tradition. Abraham Kuyper famously articulated Christ’s lordship “over every inch” of creation, insisting that no sphere of life, from politics to the arts to economics, lies outside divine sovereignty. In this view, owning and directing the means of production is not about self-­aggrandizement or the pursuit of profit. Rather, it is a response to God’s call for stewardship in every realm in which human influence can foster the common good. Calvinist thinkers have historically emphasized the importance of disciplined work, responsible enterprise, and social engagement, positing that economic activity can and should reflect a sacred mission. In this way, the modern evangelical emphasis on building Christian economic influence echoes a centuries-old tradition of Reformed thought linking personal vocation to Kingdom-­oriented goals.

Catholics have been late to this party. This is in one sense understandable and in another strange.

It is understandable in light of the Catholic Church’s relatively uneasy relation to capitalism. While recognizing property as fundamental to the natural law and consistently condemning both Marxism and socialism, Catholic philosophy and social teaching also point out that serious dangers can arise in a commercial society. It is entrepreneurs and markets that bring us abortion, addictive devices, and the “recreational” drug industry. These anti-human market outcomes were foreseen, and warned of, by popes and Catholic economic thinkers more than a hundred years ago. In part for this reason, Catholic social teaching is relatively general in its prescriptions, emphasizing principles and urging whatever social structures might ennoble rather than diminish the human person.

But Catholic hesitation concerning expansive Christian economic authority is also surprising, for several reasons. To begin with, the Western understanding of property, and of the rights and obligations attached to it, is partially rooted in Catholic philosophy. In contrast to post-­Enlightenment views of property as a “zone of sovereignty” mediated and bounded by the “social contract,” the Catholic understanding has always emphasized that property is natural—in fact, innate—to man. Thus, the idea that ownership offers a central foundation for human flourishing, though emphasized by Protestant theology, has also been a tenet of Catholic political thought from the beginning.

Furthermore, Western civilization, which sprang from European Catholicism, was in a material sense founded on, and required, a form of Christian Ownership Maximalism. It was founded on feudalism, which imparted a social and theological dimension to ownership of the means of production. Marx recognized this. He viewed the Catholic religion as a “superstructure” whose economic foundation, or base, was feudalism. The feudal lords who owned the means of production, and therefore wielded power, did so within parameters imposed by the Christian Church, including obligations to support the works of the Church.

Finally, Catholics should recognize Christian Ownership Maximalism as a variant of what is perhaps the only version of political economy that is decidedly Catholic: the so-called distributist school. The main distributists were G. K. ­Chesterton and Hilaire Belloc, who argued that a wide distribution of the means of production promotes both freedom and virtue. It promotes freedom because property is power, and it promotes virtue because property is natural to man and demands something of him.

Renn, Wilson, and Durbin are right to say that expansion of Christian ownership is a strategic and theological necessity. It is also urgent. Christian Ownership Maximalism is imperative for the simple reason that Christians are surrounded, and what economic influence they still have must be preserved and expanded. Protestant evangelization during Renn’s “Neutral World” of 1994 to 2014 entailed an attempt by Christians to be “winsome” or “attractive.” Their efforts resembled the implementation of Vatican II by Catholics.

Notwithstanding some notable exceptions, this project was largely unsuccessful. We have lost almost all the ground we once controlled, literally as well as figuratively. In just a few decades, Christians ceded control of institutions they founded and ran for centuries: hospital systems, universities, publishing houses, and culture-related industries. Being surrounded has taught us that economic power matters. “Peak woke” and the pandemic lockdowns showed us that economic authority can be deployed effectively in the service of harmful ideas.

It is foolish to imagine that these episodes will not be repeated. Much like the Israelites venturing into a land dominated by demonic pagan deities, Christians in the Negative World find themselves in situations where faith is marginalized and rival ideologies and religions compete for supremacy. Today, the biblical narrative of entering Canaan has relevance: The people of God are not told to coexist with hostile forces; rather, our mission is to establish a foothold for righteous flourishing and the advance of a new Christendom.

To borrow R. R. Reno’s terms: We see the resurgence of the “strong gods” not only in the cultural gods of political ideology, unrestrained technological impulse, and market-driven consumerism, but also in the dark gods of the occult. The task God assigned to ancient Israel is ours also. We are called to build, safeguard, and sustain faithful institutions that testify to God’s authority and promote authentic human flourishing. This mandate can be fulfilled only if Christians preserve and expand their economic authority.

I recognize that some Christians, particularly older Christians accustomed to a world in which being Christian was at worst neutral, will consider the analogy to ancient Israel a strained one. This is unfortunate, because it is primarily older Christians who own and control productive assets. For them, perhaps a straightforward economic analogy may be more convincing.

For decades, the United States funded its trade deficits by selling its assets to foreigners—mostly to China but also to Russia, Saudi Arabia, and others. The cumulative effect is measured by something called the "net international investment position": the dollar value of foreign assets owned by U.S. citizens and U.S. corporations, minus the dollar value of domestic assets owned by foreigners and foreign corporations. That figure currently stands at negative $16 trillion, which means that, on net, foreigners own $16 trillion more of U.S. businesses, real estate, and other productive assets than Americans own abroad. In order to fund our consumption, we have sold roughly half of American economic power to foreigners ($47 trillion out of $94 trillion). We hold only about 10 percent of theirs ($31 trillion out of $320 trillion).
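The bookkeeping here can be checked in a few lines. What follows is a minimal arithmetic sketch, not part of the original essay; the variable names are mine, and the figures are simply the rounded trillion-dollar values quoted above:

```python
# Illustrative arithmetic for the "net international investment position" (NIIP),
# using the rounded figures quoted in the text (all values in trillions of USD).
# Convention: NIIP = foreign assets owned by U.S. residents
#                    minus U.S. assets owned by foreigners.

us_owned_foreign_assets = 31.0   # U.S. holdings abroad
foreign_owned_us_assets = 47.0   # foreign holdings of U.S. assets
total_us_assets = 94.0           # total U.S. productive assets
total_foreign_assets = 320.0     # total productive assets outside the U.S.

niip = us_owned_foreign_assets - foreign_owned_us_assets
print(niip)                                            # -16.0 (negative: foreigners own more of us)
print(foreign_owned_us_assets / total_us_assets)       # 0.5, i.e. half of U.S. assets
print(us_owned_foreign_assets / total_foreign_assets)  # ~0.097, about 10 percent
```

The negative sign is the whole point of the passage: a deficit position means the claims foreigners hold on American assets exceed the claims Americans hold abroad.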

Secretary of State Marco Rubio has noted that this state of affairs represents a clear and present danger to the United States. At his confirmation hearing, he warned, “If we stay on the road we’re on right now, in less than ten years virtually everything that matters to us in life will depend on whether ­China will allow us to have it or not—­everything from the blood pressure medicine we take to what movies we get to watch.” By selling our assets to ­China, and by allowing an atheistic communist country to become the world’s largest owner of productive manufacturing assets, we have enfeebled ourselves almost to the point of servitude.

Now ask yourself, “What is the net international investment position of Christians relative to non-Christians?” I don’t have the exact answer, since the relevant data do not exist, but I am certain that it is deeply negative. Christians have been selling their businesses to decidedly non-Christian, and often hostile, private equity firms for nearly two decades. Main Street has been equitized, and Christian economic power was always on Main Street rather than Wall Street. Once Wall Street has completed its purchase of Main Street, Christian economic power will have been fully sold off. The same problem exists for formerly Christian-owned commercial real estate and farmland.

This, in my view, is the deep reason why Christians were forced out of their churches during the Covid pandemic and out of their jobs at woke corporations. Christians matter less and less to policy and culture because their economic power is shrinking. The reduced net investment position of Christians stands behind these episodes, and our net position is worsening.

I have five recommendations for Christian business owners.

Do not sell your business to a private equity firm that intends to “flip” it after a holding period. The traditional private equity model is to purchase a firm, grow it for three to five years, and then ­resell it to a larger private equity firm or a “strategic” acquirer in the same industry. This model has two effects that represent a problem for the Church. The imperative to grow as fast as possible, often for the purpose of servicing debt, almost always means that businesses with a Christian corporate culture are commanded to minimize their Christian character. Because we live in the Negative World, overt Christianity is seen as detrimental to revenue growth. (In many cases, this perception is reinforced by a woke orientation within the private equity firm.) Moreover, the private equity model transfers ownership upward into ever larger ownership structures in which Christians exert no influence. As private equity firms chase ever smaller companies, they destine many formerly Christian firms to non-Christian ownership. Birthrights are sold for bowls of pottage.

Sell your business instead to other Christians or to a Christian investment firm that intends to maintain your culture and own your business for a long time. Most business owners have the bulk of their net worth tied up in their businesses. For the Christian business owner, this fact presents a dilemma. If a Christian business owner sells his business to traditional private equity in order to monetize its value, the business will lose its Christian character. But if he does not sell to private equity, he may never access the wealth he has created. In recent years, solutions to this dilemma have emerged. We now have Christian CEO and executive organizations: C12, Legatus, SENT, and Convene are good examples. These organizations are networks of Christian business owners, some of whom have holding companies and are seeking to acquire successful companies. Similarly, evergreen funds have emerged to acquire high-quality businesses that have been owned and operated in a Christian manner.

Be intentional with your real estate. Real estate presents an opportunity to “take up space” for the Kingdom, because it is space. Real estate owners can do simple things such as placing crosses in publicly observed areas. They can make real estate available to churches and Christian groups. They can allow priests or Christian counselors to set up “Ask Me Anything” booths. Subject to legal constraints, they can deter activities that are evil and harmful to human flourishing, as by refusing to lease space to marijuana shops and topless bars.

Operate your business in an intentionally Christian manner. Business owners should consider practices and employee benefits that promote a Christian ethos and worldview. Examples include a paid chaplaincy or an organizational structure that includes a “Chief Prayer Officer” who prays daily for the company and each of its employees. The company may provide Christian mental health counseling and a benefits program that includes support for marriage, marital counseling, financial literacy, and even debt reduction. It may give employees paid time off to volunteer in the community. Finally, consider consecrating your business to God and opening meetings with a prayer.

In a similar spirit, I have four recommendations for Christian asset allocators and wealth advisers.

Implement values-based screening and exclusion lists, and refrain from investing in companies and funds that violate Christian ethics. Obviously, this includes firms involved in pornography, abortion services, anti-family policies, or other objectionable activities.

Conduct due diligence on asset managers and custodians. Evaluate the policies and practices of financial institutions to determine whether they promote or fund initiatives that conflict with a Christian worldview. Even if a portfolio excludes certain woke corporations, an asset manager’s personal political or social activism may be at odds with Christian beliefs. Request transparency concerning the institution’s proxy voting guidelines, charitable giving, and lobbying activities. Consider smaller or specialized Christian broker-dealers or custodians who affirm biblical principles.

Espouse active ownership and take proxy voting seriously. Do not delegate proxy voting to fund managers, who may vote shares in favor of proposals that are antithetical to Christian social ethics. Vote against resolutions or board candidates that support policies that run counter to Christian values.

Diversify through Christian and faith-aligned platforms. Consider partnering with or allocating funds to dedicated Christian financial platforms—for example, specialized Christian long-term private equity, venture capital, or debt funds that prioritize Christian values in their investment decisions and governance.

All Christians should recognize that Christian ownership is fundamentally different from secular ownership. It is both an honor and an obligation. God saw fit to let man participate in his work of creation. We need to take our jobs seriously.

We must remember that Christian ownership and renunciation go hand in hand. In the Gospel of Luke, Jesus says:

Suppose one of you wants to build a tower. Won’t you first sit down and estimate the cost to see if you have enough money to complete it? For if you lay the foundation and are not able to finish it, everyone who sees it will ridicule you, saying, “This person began to build and wasn’t able to finish.” Or suppose a king is about to go to war against another king. Won’t he first sit down and consider whether he is able with ten thousand men to oppose the one coming against him with twenty thousand? If he is not able, he will send a delegation while the other is still a long way off and will ask for terms of peace.

Reading this, one expects Christ to conclude with something like, “And therefore when you set out to follow me, you should be prepared. You should plan for hardship.” Instead, he says something totally different: “In the same way, if you want to follow me you must renounce your worldly goods.”

In the same way. Christ makes renunciation analogous to planning and consideration. He instructs us that renunciation is the foundation of the authentic Christian life, which includes ownership. Yes, we should be enterprising and ambitious on behalf of our business ventures. But we must always remember that what we own is not “ours.” We are stewards, and in our economic activity we must make the words of Jesus in the Garden of Gethsemane our own: “Not my will, but thine be done.”

The post Christian Ownership Maximalism appeared first on First Things.

México Profundo https://firstthings.com/mexico-profundo/ Mon, 01 Dec 2025 06:00:00 +0000

The End of Catholic Mexico (1855–1861):
Causes and Consequences of the Mexican Reforma

by David Gilbert

Vanderbilt University, 314 pages, $34.95



Catholic Women and Mexican Politics, 1750–1940
by Margaret Chowning

Princeton University, 376 pages, $32

There is a narrative of Mexican history that might be called “liberal,” or perhaps more accurately “liberal-national-revolutionary.” It says that enlightened thinkers and politicians in the nineteenth century, led by political liberals such as Benito Juárez, took on the entrenched power of the Catholic Church and ultimately prevailed, introducing freedom of speech, freedom of religion, and secular education in the Constitution of 1857 and further ­reforms in the ensuing decades. 

This liberal Reforma stripped away much of the inherited wealth of the Church, greatly reduced Catholicism’s political influence, and made religious devotion a private matter. Public processions, for example, were outlawed, and public schools were prohibited from teaching religion. The long dictatorship of Porfirio Díaz, known as the Porfiriato (1876–1911), undermined these gains, as Díaz, once a liberal hero, sought accommodation with both the Church and foreign capital in his pursuit of “order and progress.” The religious hierarchy regained some of its wealth and power, and piety returned to the streets and schools in much of the country. 

The Mexican Revolution (1910–20), the liberal narrative continues, restored the civil and legal progress that had been lost during the Porfiriato and went beyond the liberal era in two key areas. First, the revolutionaries of the twentieth century were committed to a broader social and economic agenda, which entailed the destruction of the hacienda and sought the prosperity of the rural poor and urban workers. Second, the liberals of the nineteenth century had embraced Mexican nationalism in theory, yet they had looked to the United States as the model liberal polity and depended on American help for some of their key military victories. By contrast, the revolutionaries of the twentieth century rejected the American model, stood up to the United States, and asserted Mexican sovereignty over the nation’s lands and minerals. 


The culmination of this account of Mexican history is the presidency of Lázaro Cárdenas (1934–40), during which the Mexican state distributed millions of acres of land to formerly landless peasants, sided with the workers in dozens of strikes, and expropriated the British- and American-­owned oil industry, turning the new state-owned Petróleos Mexicanos (PEMEX) into one of the great exemplars and bulwarks of Mexican nationalism. Later generations of politicians ­succumbed to the temptations of corruption and opportunism during the long one-party rule of the ­Institutional Revolutionary Party, but the Constitution of 1917 and the Cárdenas presidency survive as the revolutionary DNA and proof of concept. 

More recently, the neo-Zapatista uprising in Chiapas in the 1990s—never definitively defeated and, at least rhetorically, continuing to the present—demonstrates the continuing appeal of the revolutionary vision. In the liberal story, the Mexican state’s great and enduring accomplishments are its liberation of the people from the institutional Church and from Catholic “fanaticism,” the rendering of a measure of social justice for the peasant and the worker, and the assertion of Mexican nationalism against all forms of foreign domination, especially that of American capital.

This liberal narrative can be criticized from various standpoints, but I will focus here on the religious perspective, from which it is not so much wrong as incomplete. Its problem is not only that it devotes little attention to Catholicism, but that it cannot account for the depth of the Mexican people’s religious commitment. If the liberal narrative is the basic story of modern Mexico, why is Mexico still so Catholic? Unlike its neighbor Guatemala, now almost 50 percent Protestant, Mexico remains an overwhelmingly Catholic country, despite sharing a long border with that hegemon of evangelical Protestantism, the United States, which has sent Protestant missionaries into Mexico for more than a century. Why did Catholic peasants—the very people who were supposed to be the primary beneficiaries of the Revolution—fight a bloody war against the new revolutionary state in the Cristero Rebellion of 1926–29? Why did they rise again in the 1930s? In the nineteenth century, why were peasants and indigenous people more likely to side with “reactionary” conservatives than with forward-­looking liberals? There seems to be a deep-seated Catholicism in Mexico that transcends race, class, and social status and that persists despite all manner of political, legal, military, and cultural pressures. 

Historically, many liberal and revolutionary statesmen insisted that Catholics who resisted liberal and revolutionary reforms had been brainwashed into fanaticism and superstition by priests and old women and simply did not know what was good for them. Contemporary historians of Mexico, who skew left and secular, are not quite so quick to condemn Mexican religiosity, but nevertheless have downplayed Catholicism as a political, economic, and cultural force. In most cases, we have not so much a conspiracy as a mismatch between historians’ ­interests—leftwing politics, progressive moral reform, resistance to class, race, and sex-based oppression—and the undeniable centrality of Catholicism in Mexican history. At their best, these scholars recognize that they are not quite getting to the heart of the Mexican experience. 

The last two decades have seen a steady stream of scholarship on Catholic aspects of modern Mexican history, by historians such as Stephen Andes, Jürgen Buchenau, Matthew Butler, Ben Fallaw, Jaime Pensado, Brian Stauffer, William B. Taylor, Edward Wright-Ríos, and Julia Young. (There has also been some helpful work on Mexican Protestantism, a tradition that has at times mediated between Catholicism and liberalism.) Two recent books, one by David Gilbert on the Liberal Reforma of the mid-­nineteenth century, the other by Margaret Chowning on the political role of Catholic women, serve as excellent complements (or antidotes) to the liberal version of Mexican history. If liberal scholars incorporated these fine works into their teaching and research, they would present a richer and more coherent account of the Mexican past.

Gilbert’s book, based on extensive archival work and many previously unknown sources, is a detailed retelling of the period from 1855 to 1861, the central years of the Reforma, during which, he argues, a “culture war” rapidly polarized Mexican society into warring camps that no longer understood or sympathized with each other. On the liberal side, the loss of more than half the national territory to the United States in 1848 was a traumatizing event that made liberals reject what they saw as the backward economy and stultifying culture of their home country in favor of the industrial dynamism and progressive outlook of the United States. On the conservative side, defeat by the United States provoked not emulation but rejection, a doubling down on the goodness of Hispanic, Catholic culture in contrast to the crass pragmatism of the neighbor to the north.

Gilbert’s book is worth reading for many reasons, but let me focus on one: the extreme nature of liberal reform. In the United States—founded by men of ­various religious persuasions and embracing religious liberty in the First Amendment to its Constitution—liberalism seems moderate, normal, even banal. That was not the case in nineteenth-century Mexico. For centuries before the Spanish Conquest in 1521, the Aztecs, Maya, Zapotecs, Mixtecs, and many other indigenous peoples had lived under arrangements in which religion and politics were so intertwined as to be almost ­indistinguishable. Spanish colonial society replaced indigenous religion with Catholicism, but religion and politics remained deeply interconnected. Mexico did not experience the Protestant Reformation in the sixteenth century, nor did it move toward religious toleration or accommodation in the seventeenth century, as some European states were forced to do. ­Enlightened ideas did filter across the Atlantic, but they appealed only to a tiny elite. In short, in 1850 ­Mexico was Catholic, deeply Catholic, in a way that Europe had long ceased to be.


In this Catholic, conservative, traditional society, liberal ideas were explosive, unpopular, and to some Mexicans almost incomprehensible. Religious toleration was taken as a rejection of the divine claims of Catholicism. The forced sale of Church lands was seen as either cruelty or venality. The elimination of fueros (clerical privileges) was viewed as an attack not just on the priesthood but on the moral order of a hierarchical society. Liberals faced a dilemma: If they stayed true to their ostensible principles of religious neutrality, private property, and civil freedom, the Catholic Church would remain dominant in the hearts and minds of most Mexicans; but if they acted to weaken Catholic power and influence, they would betray their liberal principles. As Gilbert documents, time after time they chose to betray their principles. They closed monasteries. They expelled priests and bishops from the country. They imprisoned Catholic leaders without trial. They outlawed criticism of the government. They confiscated Church property. To make Mexico liberal, they acted illiberally. To establish religious toleration, they acted intolerantly. Readers, even American readers, will have a difficult time viewing the Reforma as reasonable or natural. If they have eyes to see, they will recognize the shocking radicalism of liberalism.

Chowning’s book, focused on Catholic women’s activism, fleshes out some of the background to the resistance and conflict featured in Gilbert’s work. Mexican women, Chowning demonstrates, were extremely involved in Catholic lay associations in the nineteenth century. Some of these associations were dedicated to social service, but most had a more religious or ­spiritual purpose, such as devotion to Our Lady of Guadalupe or the Sacred Heart of Jesus. Chowning contends that women’s involvement in these organizations was inherently political, in that women collected and managed dues and fees, negotiated with Church authorities, and developed their own goals and procedures. Then, having built their own networks and raised up their own leaders, Catholic women entered the political arena when their interests as Catholic women were threatened, most notably in response to liberal reforms in 1849 and 1856. It was in the cities where Catholic women were most involved in lay associations, Chowning shows, that they protested most vociferously. The entrance of women into the public square scandalized liberals and, when followed by similar actions against the revolutionary reforms of the twentieth century, led them and their revolutionary heirs to deny the vote to women until the surprisingly late date of 1953. Although theoretically committed to equal rights and the liberation of women, Mexico’s liberals feared that women’s votes would be overwhelmingly ­reactionary.

Notably, Chowning shows that the most successful and popular women’s groups were dedicated to Eucharistic adoration. The most important such organization of the nineteenth century, Vela Perpetua (Perpetual Vigil), named in each local chapter thirty-one cabezas de día (day leaders), each of whom was responsible for organizing pairs of women to serve half-hour vigils in front of the Blessed Sacrament for twenty-four hours of one day of each month. In this way, there would be continuous Eucharistic adoration by at least two women, all day, every day. Chowning is most interested in how Vela Perpetua fostered female leadership, political awareness, agency, and activism—and she makes a convincing case that it accomplished these things—but to me the story of Vela Perpetua and similar organizations points to spiritual realities that may answer some of the most enduring questions about Mexican history.

I have long pondered Mexico’s spiritual vitality. How did so ­many Mexican Catholics manage to keep the faith amid the onslaught of liberalism, revolution, secularization, American invasions, Protestant evangelization, and outright persecution? In the past I have attributed their resilience to the kind of faith that is forged in adversity. As Christ says, “Blessed are those who are persecuted for righteousness’ sake.” Many Catholics also point to Our Lady of Guadalupe and the ways in which Mexico might be under her special protection. Without discounting those explanations, I’m now inclined to see adoration as the hidden source of the nation’s spiritual reservoir. Deep Mexico, México ­profundo, is Eucharistic Mexico. 

The post México Profundo appeared first on First Things.

Kings, Behold and Wail https://firstthings.com/kings-behold-and-wail/ Fri, 28 Nov 2025 06:00:00 +0000

I was a full-time parish priest at a time when we still visited people in their homes. In one congregation, I made a commitment to visit every member my first year. One hundred and fifty families—it was doable. On these visits the conversation might start differently, but I would eventually hear something about the church: questions, complaints, hopes, concerns, suggestions, passionate convictions. I tried to listen, to carry it back with me. It was easy to dismiss some of what was said, but not most. These were Christians; the church was a ­deeply life-giving reality, however they articulated it. It all mattered.

Even so, I couldn’t respond to many of these concerns. The parish goes on, with its vestry, its budget, its bishop, its doctrine, vision, and mission, this ministry or that. Many of those passionate convictions had to be put aside in the ordering of the whole. In a few cases, people left the church because their hopes or perceived needs were not addressed; others bit their tongues. Perhaps they assumed that their opinions were not that crucial anyway, but their sense of not being heard thinned out their sense of belonging. Most carried on, letting “the whole” guide their attitudes, not the other way around. One vestry member noted, “The Kingdom of God may involve leaving the flock in search of one lost sheep; but our church isn’t the Kingdom. Stick with the flock.”

If the parish church is not the Kingdom, at least not in its fullness, then the body politic certainly is not in any respect. If there are aspects of ecclesial life that are unevangelical in their lack of Kingdomlike regard, political action may itself be inherently unevangelical. Politics is necessary, to be sure, but lots of things are necessary that are not congruent with the gospel: killing in self-defense, using usurious credit cards to buy groceries or cover your car repairs, paying taxes knowing that a lot of it is wasted or ethically misdirected. These things are not neutral in relation to the gospel. They are in some respects contrary to it. Yet few would consider it reasonable to avoid them. It is the same with politics. Judgments and decisions must be made for the life of the community: You need politics, just to keep things going, for you, for most people.

But, alas, not for all people. Not for the one lost sheep. That fact has been a major point of theoretical contention in political philosophy. Aristotle was adamant about the communal nature of human existence and hence asserted the essential and even superior good of politics over individual needs. By the twentieth century, social thinkers as diverse as Hannah Arendt, Karol Wojtyła, and Michel Foucault demurred. They argued that a broad embrace of political action—valorizing “society” and “the state”—tends to obliterate persons. People get swallowed up in policy decisions that aim at group trajectories, incentives, aggregates, or corporate solidarity. Call it “political triage.” 

To be sure, behind this swallowing-up of individuals lie the best of social motives: strengthening public institutions, extending educational opportunity, mitigating poverty, expanding affordable healthcare, providing accessibility to the disabled, securing physical safety in public spaces, encouraging corporate responsibility. With every refashioning of healthcare comes a lower-middle-class family or single worker who can no longer afford it; with every opening-up of parking for the handicapped comes a small business owner who has to close up shop; with every inclusion of a once-marginalized student or topic in a classroom comes another student who is weighed down by adjustments to the schedule and curriculum. The problem is not unique to the left or right. It’s just what happens when decisions are made for the good of the whole: Something—someone—else gets left behind.

The notion and practice of epieikeia (also associated with Aristotle), or, as it generally came to be called, “equity,” seeks to mitigate or adjust the strict application of political decisions (laws) to take account of the needs of individuals and their particular circumstances: to be fair, not just right. In early modern England, courts such as the Chancery existed to ensure no sheep would be lost as the community pursued its collective good. Equity deals with the problems of political triage, when a decision for the good of the whole is made, knowing full well that at least a few will suffer in the process. Yet the application of equity has rarely limited suffering, all the more so in modern times, when the pressures of large populations now define our common life. Large numbers are not abstractions one can easily manipulate; they are massive obstacles to governance that accounts for individuals. They inflate political triage.

Healthcare is one of the most obvious examples. Everyone knows that what we have (and most nations have) doesn’t work very well. We can’t afford it, it doesn’t do what we say we want it to do, and in the process there are those who are enriching themselves on the backs of sick people who are not getting the care they need. There are many smart people trying to fix healthcare, and I cheer their efforts. Thus far, though, we are outwitted. It seems impossible to find solutions for large numbers of people that account for individual needs. Take illegal immigration, a major problem for any nation to address. This problem involves people in my parish whom I serve and love. Even reasonable and just policies will affect them, and therefore me. Those policies, no matter how immaculate in their conception, will dissolve, as it were, the eucharists we share. What to do?

The obscuring of the person, even his loss in political action, is not simply a matter of incompetence or malice. It is a mark of fallenness. “And the cow and the bear shall feed; their young ones shall lie down together: and the lion shall eat straw like the ox” (Isa. 11:7): That is the Kingdom of God. “Will a lion roar in the forest, when he hath no prey? Will a young lion cry out of his den, if he hath taken nothing?” (Amos 3:4): This is today. The lament for the missing sheep within politics is therefore not a plea for political moderation (though that is rarely a bad thing). It is rather a plea for repentance.   

In this case, politics is a bit like preaching: You always end up blaspheming, even when you try hard to honor God’s Word. Karl Barth is meant to have said, “Every sermon is a heresy.” This happens because our finite efforts invariably focus on only one thing in the face of God’s infinite truth. I had an old priest tell me that, whenever he got up to give a sermon, not only did he begin to tremble, but as he moved along through the homily, he felt a rising sense of trepidation. He was bound to misrepresent God and his Word. By the time he left the pulpit, he was almost in tears with sorrow. “Preaching is my Ash Wednesday,” he said.  

For a long time, I thought that was a bit hyperbolic. But my own years of preaching have confirmed his worries. Not idly does the New Testament refer to the terrible “judgment” that will befall teachers, especially, as Jesus says in Matthew 18:6, those who cause “little ones” to stumble. How I have, over and over, fallen short of speaking well of God! The “Word” is indeed a “burden” (Mal. 1:1). Yet we are called to preach. Hence, the Christian vocation is inherently bound up with trepidation and sorrow, and it should not be pursued unless these attitudes form a part of its achievement. Be careful indeed! “Let not many of you become teachers, my brethren, for you know that we who teach shall be judged with greater strictness” (James 3:1).

And politics? Trepidation and sorrow are necessary and fitting attitudes as well. Christians especially should tremble in the face of politics: Every decision turns its back on some sheep, whom the Lord knows by name (John 10:3). Like preaching, it is not for most of us. Who knows what kind of person we will end up becoming if we take on its burden? The eighteenth-century French moralist Nicolas de Chamfort said, “If you want to see how far each condition of society corrupts men, examine what they are after they have had its influence for the longest time—that is, in old age. See what an old courtier is like, an old priest, an old judge.”

I can speak for priests. Many are sorrowing, even bitter, for in their pastoring, their administration, and, alas, their preaching, they vigorously implemented broad systems for the triage of souls. And those who do not sorrow should. We lost sheep as we devoted ourselves to the needs of the flock. I cannot speak for politicians. Herod the Great, many historians now argue, was a great state-builder, Bethlehem’s babies and all. One hopes a Christian politician does better, building states without slaughtering the innocent. But better governance still involves political triage. Even the most just and upright ruler, jurist, and legislator must, on some deep level, look up to his Lord, behold his spurning gaze upon priest and king, and wail (Lam. 2:6).

The post Kings, Behold and Wail appeared first on First Things.

How to Become a Low-Tech Family
https://firstthings.com/how-to-become-a-low-tech-family/
Wed, 26 Nov 2025

The Tech Exit:
A Practical Guide to Freeing Kids and Teens from Smartphones

by Clare Morell

Penguin Random House, 256 pages, $27

Is there a life beyond the screen? In 2010, Nicholas Carr’s The Shallows described what the internet was doing to our brains. Although still relevant today, Carr’s book came on the scene before two major events: the rapid proliferation of smartphones and the explosion of social media activity. In 2024, Jonathan Haidt’s The Anxious Generation showed that these new developments had fueled dramatic increases in rates of depression, anxiety, and other mental health issues in Gen-Z teens and young adults—and left many unable to conceive of a life that isn’t saturated by screens.

Silicon Valley, we have a problem.

Clare Morell’s The Tech Exit is strong on solutions and strong on hope. Morell begins by laying out the problem, taking aim at two powerful myths in our culture, myths widely repeated because they sound so reasonable. The first is that if technology is harming your child, you can remedy the situation with screen-time limits. Children can use screens less—an hour a day, for example. 


As Morell points out, time limits simply don’t work. Teens crave social acceptance and peer approval, and these cravings are only amplified by screens. Moreover, digital experiences can make the real world feel so unbearably dull that, even after only a little time online, kids will keep longing to return to their devices. The result? Parents who try to enforce screen-time limits “are constantly having to stand between a drug-dispensing machine and an underdeveloped brain. It’s an untenable, exhausting situation.”

The second myth: Parental app controls are effective at limiting kids’ access to harmful online content. Morell demolishes this fanciful idea, pointing out that it’s often easy to find work-arounds and loopholes. She recounts how one boy circumvented a parental monitoring app on his phone by going into the app itself, clicking on the Support button, and then searching for porn from within the browser that opened up. 

And Morell reminds us that kids don’t even need to go looking for explicit material. Social media is a conveyor belt for content so dehumanizing, violent, and grotesque that “the porn children view today makes Playboy look like an American Girl doll catalogue.”

Though The Tech Exit includes some disturbing and tragic stories, refreshingly it focuses not on digital harms, but on reclaiming a life free of screens. Freedom begins with what Morell calls the “fast”: a total screen detox. In a culture such as ours, which operates according to the Rule of Tech Ubiquity—technology anywhere, anytime, for anyone, to do anything—complete withdrawal may be a daunting proposal. There are nuances and exceptions in Morell’s approach, but the initial prolonged period of abstinence is essential if we are to discover who our children are.

The first couple of weeks, Morell concedes, can be “hell,” with kids upset or distressed that they can’t use their screens anymore, and parents having to spend far more time with their children. (Five hours of Monopoly a day, anyone?) But as one mom observed, once tech “gets out of their system and if you hold your guns, you will see these versions of your kid that you’re like, ‘Well, if I had known this, I would have done this forever.’”

The core of Morell’s prescription is not the fast but the FEAST—her acronym for five basic commitments toward reclaiming a low-tech life. Find and connect with likeminded families; get buy-in from kids by explaining and educating them on the harms of digital tech, and exemplifying healthy tech use; adopt alternatives to smartphones; set up accountability and screen rules; and trade screens for real-life responsibilities and pursuits. 

Morell emphasizes that families cannot be islands of tech resistance, but must join with other families in their neighborhoods and schools. This gives parents allies and kids friends who aren’t on screens. Some families sign on to a version of the Postman Pledge (named after Neil Postman, the media critic), formally affirming their commitment to limit their tech use and build a new community ethos. Specific tech-reduction strategies include: having a landline at home; replacing smartphones with “dumb phones,” with only call and text options; giving children not just chores, but “adult” responsibilities such as cooking and shopping (or home repair, we would add); encouraging play in nature, walks, reading, board games, crafting, music, journaling, and tinkering.

All that might sound ambitious, but our impression—having lived as a low-tech family for twenty years—is that Morell’s approach is realistic and her hope well-founded. Some reviews of The Tech Exit have accused Morell of idealism, as if she assumed that life will be perfect once the screens are gone or greatly minimized. This is a misreading. If anything is unrealistic, it is the conventions of contemporary parenting.

Many years ago, long before we had our own children, we attended a christening. It was a solemn liturgical ceremony, as high as high church can get, except at the end, when the priest passed the swaddled infant back to the husband and wife and casually remarked, “Good luck with your project.”

It sent a chuckle through the congregation. It seemed just a quip at the time, but now, as we look back at that event a quarter century ago, we can’t help wondering whether the priest meant something more. Our culture, then, was drifting from a traditional view of family, which emphasized parental responsibility and self-sacrifice, toward an emphasis on parents’ personal fulfilment. 

The same attitude was transmitted to children. The result, today, is “acceptance parenting,” whose central mantra, as Mary Harrington has written, is: “I don’t mind what you do. I just want you to be happy.” If so much of what we do, and of what makes us happy and fulfilled, emanates from a screen, then the idea of a screen-free or low-tech life remains—for many parents and their children—an unfathomable proposition.

The Tech Exit doesn’t address this foundational change in our culture. It assumes that parents will gladly take on the duty of instructing, guiding, and setting rules—such as “no private tech use” during childhood. This assumption doesn’t negate Morell’s message, but her approach might resonate primarily with parents who are comparatively traditional or authoritative in their parenting styles. 

For these parents, at least, a low-tech existence is not just possible, but fruitful—a life in which children play outside together, a life of music and games and homes full of books, where “screen entertainment is a rare treat, not a daily occurrence, and where parents might decide to equip their teenage son with a pickup truck rather than a smartphone.”

The closing chapters of Morell’s book shift from ground-up FEAST solutions—what we as parents and families can change—to top-down solutions: how our laws and policies need to change. For instance, kids can easily lie about their ages in order to access adult content. Our age-verification law is useless, almost by design. Meanwhile, a law known as Section 230 is so abysmally written that internet companies face “absolutely no consequences for promoting child sex abuse material”—even if it results in the sex trafficking or death of children.

What keeps our laws so impotent? Morell, as a former adviser to Attorney General Bill Barr, writes: “Often as a bill gets close to getting a vote, either in Congress or in a state legislature, Big Tech swoops in with their armies of lobbyists and lawyers, an incredibly organized machine, and mounts a tremendous pressure and intimidation campaign to scare lawmakers away or buy them off.” Nevertheless, this is not a fatalistic book. Its last section, in particular, offers feasible pathways for change that can be implemented by schools, school districts, and entire towns.  

Brad East has observed that when Christians write about technology, they tend to rehearse truisms: “God is the source of all creativity; God made us to be makers; any tool can be bent toward sin or gospel service; what we need are wisdom and virtue and good habits.” This sentiment, when applied to smartphones, is like printing a holy icon onto a pack of Marlboros and expecting teen smoking to serve the good, the beautiful, and the true.

Still, worldview matters. Our ultimate beliefs—those that declare the “first things” of life—shape our values and behaviors. Even the best tech books are often reluctant to acknowledge this point. The illuminating chapter on spirituality in Jonathan Haidt’s Anxious Generation is about psychological feeling and spiritual practice more than existential beliefs. In the final pages of The Tech Exit, Morell discusses Maslow’s self-actualization theory and our need to transcend ourselves by “focusing on things beyond the self like altruism and spiritual awakening,” but again the theme is individual growth. 

So, although Morell demolishes popular beliefs around screen-time limits and parental controls, she sidesteps a conversation about ultimate beliefs. Yet just as her strategies and practices make sense only within a certain model of parenting, the FEAST to which The Tech Exit points us—real relationships and pursuits in the real world—makes sense only within a worldview that gives primacy to these domains.

Could Christianity serve as this model? It might, depending on how we frame the model, and assuming we go beyond vague and watered-down recommendations about the use of our God-given powers of creativity. There are more vital imperatives to consider. We are commanded to be stewards of the primary things God made. Above all, we are commanded to love God and each other. 

Certain corollaries follow. We cannot allow virtual reality to become more important than physical reality; we cannot allow an impulsive or emotional attraction to social media or AI to become more important than our relationships of love and self-sacrifice to real people. And if technological innovations interfere with the primary imperatives, then the innovations must be rejected or radically modified.

Not everyone will agree with this application of the Christian worldview. But only an encompassing worldview can provide a foundation for our tech-reduction strategies. Without a foundation, individual strategies become ideas on a checklist, difficult to sustain amid technological and social pressures.

The Tech Exit speaks to parents who want to save their children from the digital universe. In this way it taps into the love of mothers and fathers for their sons and daughters. A parent’s love is a powerful motivation, but not everybody is a parent or a child. Our whole society would benefit from a more principled approach to technology use. 

When it comes to managing our screens and devices, the only consensus, so far, is that if our tech makes us suffer badly enough, we should stop using it. But why make suffering the motivator? We have just undergone an uncontrolled global experiment in what smartphones and social media can do to the mental health of our kids. AI is now enticing those same kids, and us, into a new rat cage. But we are not obliged to plunge into another reckless experiment. 

Morell concludes her book with these words: “Removing digital tech from childhood is the first step, but the far greater task ahead of us is to reclaim true human flourishing.” Quite so. This ambitious and essential book throws open the door to reality. Do we have the courage to step through?

Walker Percy’s Pilgrimage
https://firstthings.com/walker-percys-pilgrimage/
Tue, 25 Nov 2025

People can get used to most anything. Even the abyss may be rendered tolerable—or, for that matter, luxurious—furnished with creature comforts so that the unbearable truth of one’s condition is overlooked: a pair of lavishly upholstered armchairs, some choice pictures on the wall, bookcases laden with the best that has been thought and said, a sound system worthy of your favorite Beethoven and Brahms recordings, a well-stocked liquor cabinet, soothing overhead track lighting, and strategically placed table lamps to make you almost forget the perpetual darkness outside your walls. The hell with the world outside, anyway: You’re home, the best place there is, you’re prosperous and well liked, you’re a fine example to your children, your wife and your mistress are better-looking than you deserve, and you can’t imagine a better life than the one you’ve got.

“The specific character of despair is precisely this: it is unaware of being despair.” So declares Søren Kierkegaard in The Sickness Unto Death, and the novelist and essayist Walker Percy (1916–1990) uses this quotation as the epigraph to his first novel, The Moviegoer (1961), which won the National Book Award, remains his most celebrated work, and heads the new Library of America edition of his early writings. Binx Bolling is the hero of this novel, and he tells his own story: Pushing thirty, a stock and bond broker in his uncle’s firm, living in a basement apartment in the nondescript New Orleans suburb of Gentilly, Binx at first seems a paragon of the dullest conformity, a surefire candidate for unwitting despair. “I am a model tenant and a model citizen and take pleasure in doing all that is expected of me.” Watching television and going to the movies are the activities that occupy most of his evenings. The local movie house advertises itself as the haven “Where Happiness Costs So Little. The fact is I am quite happy in a movie, even a bad movie.” Whereas other people cherish memories of moments of high excitement from their own lives, Binx remembers John Wayne’s gunslinger heroics and Orson Welles’s glamorous villainy. One fetching young woman or another often accompanies him to the pictures, for he falls in love, or something like it, early and often, and makes a habit of bedding his secretaries. These momentary sweethearts leave his employ when the loving dries up. “No, they were not conquests. For in the end my Lindas and I were so sick of each other that we were delighted to say good-by.”

In the hands of a novelist of more punitive temperament, a John Cheever or a John Updike or a Philip Roth, a man like Binx might be doomed to inconsequence and middle-aged defeat before his time. But Percy’s hero possesses a saving longing for a life worth living. This singular man appreciates that his daily round ought to be a perpetual renewal of “the wonder,” and he dedicates himself to the search for a worthy source of this most precious treasure. He hesitates to say that he is seeking God, for the opinion polls tell him that 98 percent of his countrymen already call themselves believers, with the remnant professed atheists or agnostics, and there is scant distinction in being “dead last among one hundred and eighty million Americans.” As he thinks things over, whether he is a laggard or a pathbreaker in his seeking is unclear to him. “Have 98% of Americans already found what I seek or are they so sunk in everydayness that not even the possibility of a search has occurred to them?”

Rescue, or at least respite, from everydayness—Martin Heidegger’s coinage, a mouthful in German, Alltäglichkeit, denoting routine and inauthentic existence—comes from the intrusion of death upon carefully modulated, well-lighted lives. For Binx, it was lying wounded by Chinese gunfire in a ditch in Korea that made him feel most alive and prompted his vow to search for the wonder. For Kate Cutrer, Binx’s stepcousin, the revelatory near-miss was surviving a car accident that killed her fiancé. As she says to Binx, 

Have you noticed that only in time of illness or disaster or death are people real? I remember at the time of the wreck—people were so kind and helpful and solid. Everyone pretended that our lives until that moment had been every bit as real as the moment itself and that the future must be real too, when the truth was that our reality had been purchased only by Lyell’s death.

The unreality of her pedestrian, psychiatrically monitored days and nights weighs her down, and after walking away from psychotherapy and entertaining with amused skepticism a marriage proposal from Binx that is the most feckless ever recorded, she takes an overdose of Nembutal. She doesn’t die, and she claims she knew the dose would not be enough to kill her. She just needed a break—to slide a bit “off-center.” “Everything seemed so—no ’count somehow, you know?” The family elders aren’t buying it, and they react with consternation to what they consider a suicide attempt in earnest.

Soon afterward, Binx takes Kate along on a business trip to Chicago, and they don’t let the family know she is going with him. Despite a botched attempt at making love, the two of them begin to realize that they belong together and might even be each other’s salvation. When they return, Binx’s outraged Aunt Emily, Kate’s stepmother, lets him have a blast of her contempt right in the face with both barrels. Her personal disgust swells into an eloquent diatribe against the rotted all-American character: “Ours is the only civilization in history which has enshrined mediocrity as its national ideal. . . . What is new is that in our time liars and thieves and whores and adulterers wish to be congratulated and are congratulated by the great public, if their confession is sufficiently psychological or strikes a sufficiently heartfelt and authentic note of sincerity.” Hers is the very best worldly wisdom: stoic, aristocratic, disdainful of gross moral collapse that nowadays passes for acceptable behavior. Mournfully she invokes the times they listened to music, read the Crito together, and spoke of “goodness and truth and beauty and nobility.” But Binx says he doesn’t love these things or live by them, and when she presses him on what he does love and live by, he is silent. There is little doubt that Percy, in some vital part of his soul, exults in Emily’s hottest animadversions, and certain readers have thought this harangue the author’s last word on Binx’s spiritual condition. Yet it also seems apparent that Emily is unable to see clearly into a soul as murky as Binx’s and discern the light that shines there, however dimly.

For when, by and by, Binx is moved to reflect on the state of the world and his place in it, he sounds not unlike Aunt Emily, though far less polite:

My only talent—smelling merde from every corner, living in fact in the very century of merde, the great shithouse of scientific humanism where needs are satisfied, everyone becomes an anyone, a warm and creative person, and prospers like a dung beetle, and one hundred percent of people are humanists and ninety-eight percent believe in God, and men are dead, dead, dead; and the malaise has settled like a fall-out and what people really fear is not that the bomb will fall but that the bomb will not fall—on this my thirtieth birthday, I know nothing and there is nothing to do but fall prey to desire.

To reach for the nearest warm girl—as long as she’s not Kate—is once again his reflex. But Kate is somehow the one for him, and riding a tidal wave of decisiveness and personal responsibility, he not only marries her but also honors Aunt Emily’s wish that he go to medical school.

What exactly got into our boy, and what can come of this sudden resolve to change his life? Percy’s philosophical learning undergirds his hero’s transformation, as Binx shows himself a practicing existentialist along the lines laid down by Kierkegaard. The quest for the wonder, the everlasting core of truth and light, Binx sets aside. As “the great Danish philosopher” has written, Kierkegaard himself lacked the authority to proclaim the truth but at best could only edify, and a blighted century later, Binx is in no position to do even that—“or do much of anything except plant a foot in the right place as the opportunity presents itself—if indeed asskicking is properly distinguished from edification.” Practical wisdom now guides his way. Like William James escaping the doldrums of fatalism, his first act of free will having been to believe in free will, Binx’s decisiveness proves above all that the course of his life is his to decide. The past need not bind him, nor incalculable possible futures confuse him. The romance of limitless possibility that marks Kierkegaard’s aesthetic man, Mozart’s hell-bent Don Giovanni being the exemplar, contracts to the settled intention of the ethical type, the one who loves his wife alone and wills the life that comes with such exclusivity.

Binx admits that he does not have it in him to become the type of man whom Kierkegaard considered the highest: the religious. The very word religion strikes Binx as suspect. Yet when his fourteen-year-old half-brother dies, Binx consoles the remaining children with calm certainty about the life eternal: When Our Lord raises us up on the last day, Lonnie won’t need his wheelchair, but will be like the rest of us—able not only to walk but to go skiing. Percy confirmed the insight of perceptive commentators that he had in mind here the final scene of The Brothers Karamazov, in which Alyosha assures a group of neighborhood boys that they will always remember their friend Ilyushechka, who has just died, and that at the appointed hour they will all rise from the dead, meet one another again, and tell joyously of everything that has been. The comparison with the saintly Alyosha attests to what is extraordinary in Binx’s soul. You can’t tell whether he will live up to it or retreat into everydayness, but at novel’s end you are pulling for him.

The sentiment of the Kierkegaard epigraph can stand as the motto for Percy’s entire oeuvre, much as the memorable lament “The mass of men lead lives of quiet desperation” commonly does for Thoreau’s. Despite their superficial similarity, Thoreau’s observation is really quite different from Kierkegaard’s and Percy’s: Thoreau’s sufferers know they are in despair. Yet Thoreau and Percy both home in on an affliction that has come to seem typically modern, and even more typically American, in its unrelenting, poignant ordinariness. How to make it through ordinary Wednesday afternoons poses a confounding moral and spiritual problem—sometimes building to a crisis—for Percy and his characters. It is the rare man—a hero—who realizes that he is a moral desperado and sees his way up and out to a triumph, however incomplete, over everydayness.

What relief, what hope, do the secular caretakers charged with the cure of American souls offer to the populace weary of their psychic burden? Not much, by Percy’s lights. Psychoanalysis was very much in vogue when Percy came of age, and while a medical student he saw his shrink five times a week for four years. In his 1957 essay “The Coming Crisis in Psychiatry” for the Jesuit magazine America, the veteran analysand argues that getting the patient to march in step with his biological imperatives and cultural norms is fast becoming a useless prescription for mental soundness. Such normality is stultifying, an insult to mind and heart.

We all know perfectly well that the man who lives out his life as a consumer, a sexual partner, an “other-directed” executive; who avoids boredom and anxiety by consuming tons of newsprint, miles of movie film, years of TV time; that such a man has somehow betrayed his destiny as a human being.

Percy goes on to paraphrase Pascal on the soul-killing endless diversion that keeps a man from understanding that he is in fact “the center of the supreme mystery”—that “he comes into this world knowing not whence he came nor whither he will go when he dies but only that he will for certain die.” To live your life in this hapless way is to miss the point of being on this earth, and it makes you worse than a fool.

What men truly crave is transcendence, Percy continues. He says that the existentialist philosophers he swears by all agree—even the notorious deicide Friedrich Nietzsche and his atheist epigoni Martin Heidegger and Jean-Paul Sartre. Of course Heidegger and Sartre proclaimed transcendence for the masses in monstrous political ideologies, one sage serving Hitler and the other endorsing first Stalin and then Mao, but Percy finds them suggestive as diagnosticians of the epidemic malaise. Percy, who went to medical school at Columbia and trained as a pathologist, called himself a diagnostician as well, after the manner of his medical colleague Anton Chekhov. Writing his novels and essays, Percy took his scalpel to the living dead, as he called the sufferers from a surfeit of meaninglessness who populate the upper-middle-class enclaves in which he both lived and set his novels. What distinguishes Percy from the other artists and intellectuals who condemn modern democratic life, and whose heroes find redemption in the universal solidarity of millenarian politics or in the embrace of erotic sorceresses? Percy knows something genuinely better, something that hard experience, some luck, and the authority of an apostle helped convince him was the truth.

Walker Percy was born in Birmingham, Alabama, in 1916, the oldest of three sons. His father, LeRoy Percy, educated at Lawrenceville, Princeton, Harvard, and Heidelberg, was a lawyer with a thriving practice, a testament to his fine mind, first-rate training, and privileged upbringing. Walker grew up in a grand new house adjoining the country club of which LeRoy was president. Darkness, however, encroached upon the model household. In 1925 LeRoy did time in the Johns Hopkins psychiatric hospital, brought low by anxiety and depression. Bandaged wrists marked a suicide attempt, though LeRoy spoke only of an accident in handling broken glass. In July 1929, while Walker was away at summer camp, LeRoy climbed to the attic and killed himself with his 20-gauge shotgun. There was neither marital strife nor financial trouble to explain the self-slaughter. Walker would tell his biographer Robert Coles that his father had suffered from manic depression, a disease with a high violent casualty rate. LeRoy’s father had been prey to the same illness and had likewise committed suicide, Walker believed, though his shotgun death was officially ruled accidental. The dead fathers left a distressing emotional residue: Rumors that LeRoy’s ghost stalked the attic were convincing enough that a famous parapsychologist from Duke University came to check out the scene. The dead man’s estate left his widow, Mattie Sue, and her sons comfortably established, and the family remnant moved to Athens, Georgia.

The intercession of LeRoy’s cousin, William Alexander Percy, shaped Walker’s life for the better as dramatically as LeRoy’s suicide did for the worse. With astonishing goodness and nobility, Uncle Will, a lawyer, planter, and poet in Greenville, Mississippi, the son of a United States senator, a decorated war veteran, and a lifelong bachelor, offered to help raise the boys.

Uncle Will was a marvel of cultivation, his capacious house furnished with the prized possessions of an aesthetic gourmand—lots of the best of everything: Japanese paintings; Moroccan rugs; Victorian bibelots; a bronze statue of Lorenzo de’ Medici; a marble Venus; a Jacob Epstein bust of a dear friend; the latest splendid Capehart phonograph; a record collection heavy on Beethoven and Wagner, but hospitable also to Ravel and Shostakovich; and a library fit to deck out the mind of an embryonic writer and intellectual.

This was not the flimsy refuge against despair that LeRoy Percy’s fancy house on the golf course had been, but rather the home of a man who knew what he was and what he loved, and who dispersed happiness with both hands to all who entered, despite a sadness of his own, which he could not shake. Upon meeting an interesting neighborhood youth, Uncle Will told him that he had some young relations staying at his house, and they would welcome his company. The interesting youth was Shelby Foote, who would become a novelist and the author of an extraordinary three-volume history of the Civil War, an American classic that is as impressive an achievement as Walker Percy’s fiction. Foote and Walker would be best friends for life, each challenging and enriching the other’s mind, and just having a high old time together. The volume of their correspondence is witness to one of the most heartening friendships in American literary life.

For all the joy and promise of this new life, sorrow proved inescapable. In 1931, Mattie Sue Percy drove off a bridge over Deer Creek and died. There was speculation that a heart attack might have sealed her fate before she hit the water, but no autopsy was performed, and exactly what had happened was left in the air. Walker, for his part, was sure his mother had committed suicide—and even attempted murder, for his brother Phinizy had been her passenger. Phinizy had escaped to tell of his mother’s reaching out to him underwater, her foot stuck between the accelerator and the brake. Walker never told his family members of his suspicion, though he did share his angry grief with some friends. He appeared to be staggering along under the burden of an inherited curse.

Many another person would have cracked under the strain. Percy, however, persevered gamely, taking his bachelor’s degree from the University of North Carolina at Chapel Hill—where Foote was a year behind him—and then his medical degree from Columbia. His immersion in psychoanalysis—the cure that is worse than the disease, as the Viennese wit Karl Kraus put it—seems to have done him no harm, though he never said that it helped him come to terms with the family penchant for violent exits.

During the pathology rotation of his internship at Bellevue Hospital in 1942, he took part in the autopsies of some 125 tuberculosis patients and contracted the disease himself. His case was a comparatively mild one, with a persistent slight fever and non-productive cough—fortunately a far cry from the dreaded galloping consumption. Percy resigned from Bellevue and withdrew to a sanatorium in the Adirondacks. The enforced leisure, like Hans Castorp’s in The Magic Mountain, served an elevated purpose, as Percy examined what he really knew and how he had been living. His scientific cast of mind had taken him far along in the knowledge of human activity in this world but had stopped short of the ultimate questions, under the aspect of eternity. “I had found that [scientific] method an impressive and beautiful thing, the logic and precision of systematic inquiry; the mind’s impressive ability to be clear-headed, to reason. But I gradually began to realize that as a scientist—a doctor, a pathologist—I knew so very much about man, but had little idea what man is.” His physiology and bacteriology textbooks yielded pride of place to the philosophic tomes, novels, and plays of Kierkegaard, Dostoevsky, Heidegger, Sartre, Albert Camus, Karl Jaspers, and Gabriel Marcel—the existentialists in the vanguard of the modern understanding of what it is to be human. He came especially to cherish Kierkegaard’s improvement on the all-knowing iron-hearted philosophy of Hegel, which “explained everything under the sun except one small detail: what it means to be a man living in the world who must die.” Pondering over his most profound needs was Percy’s salvation. He would later say that catching tuberculosis was the best thing that ever happened to him.

It was upon his recovery from the illness that he became a Roman Catholic. In 1948 he and his wife of two years, whom he had married in a Baptist church, though neither had really been a believer, were baptized as Catholics. No spectacular Pauline conversion had taken place; grace worked in him by degrees over the years, as Paul Elie suggests in The Life You Save May Be Your Own, his joint biography of Percy, Flannery O’Connor, Thomas Merton, and Dorothy Day. Percy’s mind and heart and soul were swayed variously by a friend at Chapel Hill who quietly got up at dawn to attend daily Mass, a Catholic sanatorium patient who defended his faith with steamroller logic and imperturbable deftness, the formidable reputation of Scholastic philosophy, and the gathering awareness that the faith offered a wisdom unavailable to the scientific method. The undercurrent of melancholy in his beloved Uncle Will, a man of integrity, intellectual delight, and personal vigor but no belief in God, helped Percy realize that he needed more for the life of his soul than reason, aesthetic bliss, and warm friendship. And an essay by the Protestant Kierkegaard, “On the Difference between a Genius and an Apostle,” likely provided the more-than-intellectual capstone to Percy’s conversion: Kierkegaard writes, “I have not got to listen to St. Paul because he is clever, or even brilliantly clever; I am to bow to St. Paul because he has divine authority. . . . Authority is the decisive quality.” Unlike the Genius, the Apostle was not trafficking in complicated abstract speculation but bearing news of an event that had actually happened and that he had witnessed. Percy recognized the authority of the Divine Word and set about living accordingly.

Percy changed his life’s direction in other ways as well, abandoning his career in medicine to become a writer. Just what kind of writer was not immediately clear, as he composed a couple of unpublishable novels and a few essays with titles such as “Symbol as Hermeneutic in Existentialism” for which journals such as Philosophy and Phenomenological Research paid him in offprints. His breakthrough came with The Moviegoer, after more than a decade of laboring in obscurity. Over the next twenty-nine years he would publish five more novels, two essay collections, and a mostly light-hearted guide to practical metaphysics, Lost in the Cosmos: The Last Self-Help Book.

In Percy’s second novel, The Last Gentleman (1966), Will Barrett—Alabama native, Princeton dropout, army veteran, maintenance man at Macy’s, and occasional amnesiac—falls in love with Kitty Vaught at first sight while looking through his telescope in Central Park. Themes that would become common in Percy’s work spring into action. Will recalls his father’s happiness when Pearl Harbor was attacked: “The dreadful threat of weekday mornings was gone! War is better than Monday morning.” The deliquescent lushness of a civilization in decay can be heard in Brahms’s “Great Horn Theme . . . the very sound of the ruined gorgeousness of the nineteenth century, the worst of times.” It emerges that Will, when still very young, was listening to the Brahms recording when the sound of a shotgun blast roared through the music: His father had gone up to the attic and killed himself. When there is no longer a war to fight, some men declare war on themselves. Suicide can be better than Monday morning.

Will is a man in need, and he is drawn to the entire Vaught family: the sixteen-year-old Jamie, sick with leukemia; Val, a quite modern nun ministering to impoverished black people; and Sutter, a cynical renegade physician from whom Will hopes to get an answer about how to live. Sutter, though brilliant, seems a dubious choice for healing sage. He is an intellectual steeped in existentialist thought, obsessed with suicide, anticipating with terrible disinterestedness his own death. For him it is not the glamour of evil but the triviality of men’s souls that makes the age hopelessly malignant, and he aches to be out of it: “Americans are not devils but they are becoming as lewd as devils. As for me, I elect lewdness over paltriness. Americans practice it with their Christianity and are paltry with both.”

Once again it is the presence of death that brings out the beauty latent in the hero’s soul. Will and Jamie had gotten along by studiously ignoring the adolescent’s fast-arriving end. At Jamie’s hospital deathbed, however, Will is roused to action by Val’s imperative, from hundreds of miles away, that he either secure a priest to baptize Jamie or baptize his friend himself, though Will is not a Roman Catholic and has no idea how to do it. While Sutter stands apart, sneering at the spectacle, and Jamie defecates in bed, with a scandalizing stench, Will enlists Father Boomer to administer the sacrament. When the priest tells Jamie “the truths of religion,” the patient is just aware enough to ask why he should believe in them. “The priest sighed. ‘If it were not true,’ he said to Jamie, ‘then I would not be here. That is why I am here, to tell you.’” His is the voice of authority, however inadequate this man—hesitant, clumsy, distracted by a stain on the wall—might appear for the job. Jamie seems to say something in response, and Father Boomer asks Will what it was. Will,

who did not know how he knew, was not even sure he had heard Jamie or had tuned him in in some other fashion, cleared his throat.
       “He said, Don’t let me go. . . . He means his hand, the hand there.”
       “I won’t let you go,” the priest said. As he waited he curled his lip absently against his teeth in a workaday five-o’clock-in-the-afternoon expression.

Shining humanity in the most intense moments has come to be thought the particular virtue of the humanists, as though they were uniquely endowed with the insight, courage, and compassion necessary to deal with extremity. (Think of Rose of Sharon in John Steinbeck’s The Grapes of Wrath, offering her maternal breast to suckle an old man dying from hunger.) Percy restores the claim of the faithful to this human excellence. This passage strikes one as more real—everydayness intruding upon the solemnity—and thus more moving than the famous deathbed scene in Evelyn Waugh’s Brideshead Revisited, in which the adulterous unbeliever Lord Marchmain emerges from unconsciousness long enough for an elegant acceptance of the last rites. Here is Percy at his best.

Others of his novels are less successful but still valuable. The Second Coming (1980) resumes the life of Will Barrett fifteen years later; ever haunted by his father’s death and the recovered memory that his father had tried to kill him, Will rants inwardly about the imminence of the Last Days, convinced that he will prove definitively either the existence or the non-existence of God by descending alone into a cave with scant supplies, where he will be saved by divine intervention or die and demonstrate that God is not there at all. After several days underground, a nauseating toothache distracts him from his sacred intention, he struggles to find his way out when his flashlight fails, and he winds up quite improbably falling into a greenhouse. The long-abandoned greenhouse is inhabited by a squatter, a schizophrenic young woman escaped from confinement: Allison Vaught, the daughter of Kitty Vaught, who had married a dentist instead of Will. Allison and Will fall in love, and Will is saved from the living death to which his father’s suicide and attempted murder had condemned him.

Will does not achieve equilibrium and ease but is stricken with a longing more insistent than ever before—for human love and for God. It is not existentialist authenticity he is after, but spiritual abundance. Warmth and uplift are not terms of approbation commonly deployed by serious critics of serious art; they seem more readily applicable to tales of magical children and pictures of blond kittens tumbling over one another in a joyous heap. Accusations of sappiness might seem more in order than words of commendation. But without embarrassment one can praise Percy for establishing warmth and uplift as literary virtues, hard-won in a world that has little place for them.

The last and most ambitious novel Percy wrote, The Thanatos Syndrome (1987), is also the most disturbing, shot through with his vehement hatred of the latest moral improvements in the name of tender-heartedness and the advancement of learning for the relief of man’s estate—to invoke a founder of the modern scientific project to remake inhuman nature and human nature, the better to serve our comfort and convenience and pleasure. Warmth and uplift are the furthest things from Percy’s mind here. He is conducting a crusade of vengeance against enormities that ought to be unthinkable but instead are readily conceived and casually executed.

We have met Dr. Thomas More before, in Love in the Ruins (1971): Psychiatrist and inventor of the Qualitative Quantitative Ontological Lapsometer, he was pursuing three lusty young women at once while trying to avoid the worst of the Bantu uprising, in which determined American blacks established their empire over an enervated white populace. In the later novel, the Bantu heyday is over and it is a new era in America, as the Supreme Court has declared “pedeuthanasia” legal for infants who face “a life without quality,” and assisted suicide for the old is proceeding apace. Dr. More, for his part, is bemused by the increasing incidence of patients who had been overcome by anxiety, insomnia, drug addiction, or sepulchral depression and now are preternaturally alert, chipper, brilliant, and rapturous with the joy of being alive. But there are also cases of quite normal people who suddenly commit acts of mindless violence—typical sufferers of “pure angelism-bestialism,” who “either considered themselves above conscience and the law or didn’t care.” With Sherlockian investigative address, Dr. More and his posse trace these bizarre psychic eruptions to the presence in the water supply of Na-24, heavy sodium isotopes, which in therapeutic dosages produce extraordinary mental acuity and emotional exuberance, but at toxic levels cause regression to instinctual animal behavior. Benevolent scientists, physicians, and government types have been conducting a local trial run of Na-24, with the prospect of going national or even global in due course. Dr. More throws a wrench in the works and saves the day. While he’s at it, our hero breaks up a ring of pedophiles at a leading private academy, who have used the isotopes to drug the children into blissed-out submissiveness, and whose diabolically cunning leader declares blandly that he is undoing two thousand years of hatred for and shame at the human body, replacing them with perfectly natural loving-kindness.

Percy gives his signature line to Father Smith, who has taken to a fire-watch tower after the manner of St. Simeon Stylites, and who reflects continually on the abominations of the twentieth century, conducted in the name of compassion and human perfectibility: “Tenderness leads to the gas chamber. . . . More people have been killed in this century by tender-hearted souls than by cruel barbarians in all other centuries put together.” Percy’s satire burns in this novel, razing to the ground the best intentions of modern scientific humanism, the prevailing wisdom of our time and place, which he has rejected because he recognizes that it is inhuman, and because he knows of something better.

Along with Saul Bellow, he is by far the most intelligent and the most decent of recent American writers. Percy’s impassioned absorption in mid-twentieth-century philosophy nourished his intellect, while his innate and inviolable goodness, an emotional brightness at his core, enabled him to avoid the excesses of such dark luminaries as Heidegger, Camus, and Sartre. Sartre’s 1938 novel Nausea had an especially stinging impact on Percy, yet Percy’s stalwart nature prevented this compact nihilist missile from doing lasting damage. For Percy came to understand Sartre’s cry of irredeemable grief—that all is pointless, life a useless interval of pain between two accidents—to be at best a fragmentary truth, born of a fascination with one’s own brainpower. To treat it as more than that was to live a lie. Percy could see his way past despair because he had more in common with the two great nineteenth-century religious existentialists, Kierkegaard and Dostoevsky, than with the atheist thinking-engine Sartre. Reason run rampant led inevitably to horror at existence and the embrace of nothingness. Earthly salvation, which might mean freedom from persistent thoughts of suicide, came not from the arrogant mind of genius but from the soul submissive before divine authority.

Percy knew all the ways the mind can go wrong, whether from organic affliction or from a hypertrophied confidence in one’s own intellect. Thorough scientific training gave him the expertise that the age values most, and his philosophizing, strengthened by inborn virtue, revealed to him the deficiencies of that expertise. Appreciative of wonder, and of what it takes to remain always aware of it, he lived by and wrote with the soul’s knowledge. Walker Percy was wise as few men can hope to be, and we have never been in greater need of such wisdom than we are now. May his reputation grow and flourish.

The post Walker Percy’s Pilgrimage appeared first on First Things.

Taming the Tongue https://firstthings.com/taming-the-tongue/ Mon, 24 Nov 2025 06:00:00 +0000 https://firstthings.com/?p=113244
On October 14, Politico reported on a group chat in which leaders of various Young Republicans groups seemed to vie with one another to see who could say the most offensive thing. Brianna Douglass, a Vermont national committee member, chided another member of the chat for “expecting the Jew to be honest.” When Peter Giunta, head of the New York State Young Republicans, was asked whether he was watching an NBA game, he replied: “I’d go to the zoo if I wanted to watch monkey play ball.” Other members responded approvingly.

As I read these messages, I recalled a seemingly very different exchange. In the spring of 1908, Virginia Woolf and her sister, the painter Vanessa Bell, were sitting in a London drawing room. In walked Lytton Strachey, whose Eminent Victorians appeared ten years later. He pointed at a stain on Bell’s white dress and asked, “Semen?”

Woolf later described her incredulous thought: “Can one really say it?” She and her sister burst out laughing. “With that one word all barriers of reticence and reserve went down,” she recalled. This moment marked a revolution in the mores of the Bloomsbury Group. Modesty and reserve gave way to the thrill of transgression. “We discussed copulation with the same excitement and openness that we had discussed the nature of good,” Woolf wrote. From that point on, the word “f*ck” sprang to their lips.

Quentin Bell, Woolf’s nephew and biographer, would later suggest that this moment marked a transformation not only in the manners of the Bloomsbury Group, but in the manners of the middle classes more generally. Judging by present evidence, he was right. The word that Woolf once found so delightfully shocking is now commonplace, broadcast to millions on television and radio, and regularly used in what was once called polite company.

The culture of transgression has been universalized. There are advantages to this—most notably, the spread of what Woolf’s husband Leonard called a “sense of intimacy and complete freedom of thought and speech”—but there are also downsides. When it is unremarkable to say “f*ck” in polite company, then other things become sayable as well. And when the thrill of transgression is pursued not only by a daring artistic set but by the culture at large, a great deal of crude transgression will result. The average man is not as clever as Lytton Strachey. He will try to get his friends to think, “Can one really say that?” by posting a Hitler meme.

This is the logic of edgy group chats like the one in which the Young Republicans took part. Seeking a feeling of intimacy and freedom, people say shocking things. But four-letter words no longer suffice. Our society is no longer structured around the observance of sexual propriety; it is structured instead around the value of tolerance. In order to be transgressive, one must speak in intolerant terms.

Our culture of tolerance is supported by two triumphal narratives: America’s victory over Nazi Germany in World War II and the success of the civil rights movement in the 1960s. Both narratives are now subject to challenge. One reason for this is the simple passage of time. When I was thirteen, my grandfather, who had fought the Germans in World War II, died. As my grandmother went through his things, she asked me to try on his old dress uniform. It was too small.

Young people who lack such memories are more likely to share Hitler jokes. They may also be tempted to transgression by the sense that these triumphal narratives have not always been invoked for good ends. The Churchill cult has repeatedly been invoked to justify ill-advised wars. Similarly, the moral prestige of the civil rights movement is now deployed to harass people such as Colorado baker Jack Phillips, whose only crime was to refuse to bake a cake for a gay wedding.

In this way, America in the 2020s may be somewhat like the England of the early 1900s. The old values are losing their sway. Then it was moral correctness that lost its grip on those who wished to transgress boundaries; now it is political correctness. The former is necessary for a decent society, but there is little left of that kind of constraint. Properly understood, the latter encourages civic politeness, a socially useful notion in a pluralistic society. But political correctness became a progressive weapon decades ago and was thereby discredited, especially among young people on the right, who have known no other use. Nothing new has arisen to take the places of these two forms of correctness.

After the Young Republicans chat was leaked, Chuck Schumer called on all Republicans to denounce the remarks “swiftly and unequivocally.” Several did, including the New York Republican Elise Stefanik. The Kansas GOP disbanded its Young Republicans group. The party in New York followed suit.

Vice President JD Vance, however, dismissed the response as “pearl clutching” and pointed out that the Democratic nominee for Virginia attorney general, Jay Jones, had fantasized about shooting a Republican colleague and expressed hope that his children would die because “only when people feel pain personally do they move on policy.” Andrew Kolvet, a leader of the conservative group Turning Point USA, likewise declined to condemn the remarks.

Is America simply too polarized to engage in universal condemnation of any specific instance of crude and offensive speech? Possibly. For that reason, I would like to make a proposal that can be regarded as perfectly non-partisan. Swear words should once more be rigorously excluded from any polite or public setting. This measure would elevate public discourse and diminish the cult of transgression, thereby reducing both the opportunities for and the appeal of offensive speech.

This idea came to me after I read an essay by the English professor Mark Edmundson on how inescapable swearing has become. Not only in the military barracks, but in many offices and middle-class homes, one hears “f*cking” this and “sh*t” that. Such words erode our respect for others and ourselves, an issue not unrelated to the kinds of demeaning speech found in that group chat. Their use promotes the idea that a man who wants to seem honest or real must speak impolitely. At some point that impulse will migrate from four-letter words to slurs. Edmundson ended his essay by observing that his father, who never swore, was “one of the very few white men I knew well growing up whom I never heard say anything racist.”

His experience was not unique. Growing up in small-town Nebraska in an evangelical church, I never heard my parents or their friends make racial remarks or other slurs, just as I never heard them say “sh*t” or “f*ck.” Those kinds of speech were unthinkable for them, and for related reasons. Their entire outlook opposed the valorization of transgression. Restraint is required for politeness—regarding race as much as sex.

The situation was slightly different when I started doing construction work. But not every place has to sound like a job site. Until we once again build a culture of modesty and restraint, we will continue to learn of offensive statements by people who aspire to public roles. They are only doing what they were taught by a society that has ceased to believe in anything except the value of transgression. Blame Lytton Strachey.

The post Taming the Tongue appeared first on First Things.

Overcoming Nihilism https://firstthings.com/overcoming-nihilism/ Fri, 21 Nov 2025 06:00:00 +0000 https://firstthings.com/?p=113254
Shoah is the Hebrew word for catastrophic ruin and unmitigated disaster. It appears in Psalm 35 as an imprecation against enemies: “Let ruin come upon them unawares!” It’s also used in Zephaniah and elsewhere to describe the desolation brought by divine judgment and in Proverbs for the inevitable ruin brought by the ways of the wicked.

The Camp of the Saints, the controversial and until now fugitive novel by Jean Raspail (retranslated and republished by Vauban Books), is a story of shoah. As Nathan Pinkoski observed in his assessment of The Camp of the Saints (“Spiritual Death of the West,” May 2023), ruin and loss were the novelist’s preoccupations. In other books, Raspail imagined the moral and spiritual experience of cultural destruction and desolation. In The Camp of the Saints, he imagines something different. He depicts the more complex and terrifying fate of self-destruction and self-chosen destitution.

Shoah is also a biblical term used to refer to the Holocaust. That historical event has had a searing effect on the West. But in my estimation, Auschwitz can be understood as a condensed symbol of the many catastrophes that led up to the death camps. The early decades of the twentieth century saw unprecedented slaughter in the trenches of the Western Front, a bloodthirsty revolution in Russia inspired by an ideology born in the bosom of Western Europe, and economic depression and dislocation that triggered political upheavals. 

Many reflective and sensitive people hungered for revolution in the 1930s. Some endorsed communism; others threw in for fascism. Different though these choices may have been, they were made against a shared judgment that the civilization of the West had reached a dead end. It was in this atmosphere of revolutionary nihilism that another world war broke out, more destructive and bloodthirsty than the first.

Germans spoke of their defeat in 1945 as Stunde Null, “zero hour.” It was the time when everything that came before, spiritual as well as material, was reduced to rubble. France did not suffer complete collapse, but after its liberation, it endured a painful and often violent reckoning over wartime complicity and compromise. Americans like to think of World War II as our “good war,” in which we defeated evil forces. But we should not neglect the effect the war had on many. John Rawls was scarred by his visit to Hiroshima shortly after the war’s end. By some accounts, that experience played a role in his loss of faith. And, of course, those who liberated the concentration camps glimpsed the shoah, not just of the Jewish people, but of their own civilization.

There are many ways to tell the story of the rebuilding of the West after 1945. One focuses on French existentialism, which captured the imaginations of many in the 1950s. The reason was simple: Sartre, Camus, and others outlined how to exist in a metaphysical vacuum, a world in which no inheritance can be honored, no authority can be trusted, no truth believed. Existentialism was irresistible, because it offered a way forward when everything has been discredited.

But existentialism’s influence was limited. After all, life goes on. In the war’s aftermath, men took up the tasks of governance; they rebuilt shattered economies and breathed what life they could into established institutions and traditional authorities. But their hearts were not in it. Soon after the war’s end, Camus drew upon Nazi rhetoric with artful irony to formulate the truth of Western civilization’s condition: “Disaster is today our common fatherland.” 

With his characteristic humanity, Camus saw in this statement a glimmer of hope. We could at least be in solidarity in our shared desolation, in our suspicion of our inheritance, in our sense that the truths and authorities we once believed in had been exposed as clay idols, smashed by historical events.

In Return of the Strong Gods, I give a sketch of how the inescapable nihilism of Camus’s formulation was transformed into a positive program for the reconstruction of the West. The open society consensus took shape, formulated by figures such as Karl Popper and Friedrich Hayek and then developed by others. But I don’t want to rehash those details. It’s sufficient to illuminate the logic of shoah, which has a paradoxically happy side, not just a sad one. Like the glamor of evil, emptiness has its allures.

One promise of complete destruction is freedom. ­Sartre played up this aspect when he noted that, due to the lack of any metaphysical truths, it falls upon us to create our own meaning. (Popper said something similar in The Open Society and Its Enemies, as have many others.) With his usual Cartesian rigor, Sartre advanced an anti-metaphysical doctrine to ensure this freedom: Existence precedes essence. Human experience is malleable, and reality is ours to make and remake. This open field for action has a political as well as a personal aspect, which is why Sartre’s doctrinaire and revolutionary Marxism was entirely consistent with his existentialism. 

The open society consensus never embraced explicit nihilism. Rather, it turned the destruction and diminution of the West’s inheritance into the blessings of pluralism, inclusion, and peace. In the hands of Isaiah Berlin, the liberal virtue of tolerance became a metaphysical doctrine of pluralism. Although seeking to avoid nihilism, this doctrine accommodates the reality of a shattered world. Old truths that once seemed monolithic are broken apart. The upshot, Berlin and others hoped, would be a society that is more welcoming and inclusive—a borderless world.

Although initially formulated by establishment liberals who hoped to lay enduring foundations, the open society consensus had few antibodies to protect against revolution. Leaders may impose pragmatic ­limits—changes can be deemed too costly or too difficult to implement. But under the shadow of disaster, revolution can become irresistible. Revolution at least seeks a future we can affirm and champion.

Nihilism also promises peace: If nothing is worth fighting for, nobody will fight. Raspail grasped this aspect of nihilism. Note the novel’s denouement. There is no cataclysmic conflict at the end. The migrants land unmolested. Yes, there is violence, but it occurs on the margins, rather like today’s drip-drip of terrorism countered by SWAT teams and drone assassinations in remote regions of the world.

And I must not fail to mention prosperity. One benefit of civilizational collapse is the elimination of impediments to commerce and innovation. Without reverence for an inheritance, we are free to treat everything as raw material. In recent years, my appreciation for Alexandre Kojève has increased. I now understand him as attempting to theorize a stable and managed condition of nihilism—which, if that is our fate, is better than the unstable and disintegrating version we see more and more of these days.

Pinkoski rightly notes that The Camp of the Saints is not a novel about the West against the Rest. It’s a story about civil war within the West. In that war, Raspail imagines a decisive moment. One million migrants are poised to land in France. In this apocalypse, the final victory of the nihilism at the heart of the postwar West is exposed. The collapse is immediate and complete. 

In this respect, Raspail was not a prophet. The civil war is not ending; it is just beginning. The failures of the open society consensus in economics, culture, and foreign policy are now evident. The reaction known as populism has gathered strength. Rather than witnessing an apocalypse, we are entering a long struggle at the end of an era. My image for the end of the postwar era is “the return of the strong gods.” The phenomenon, however we describe it, is real. Flags are being waved; the warm loves of people and place are stirring. Old truths are reemerging. Metaphysical imaginations are being rekindled. In some quarters, God’s authority is being reasserted. 

I’d like to end with a word of counsel. I hope that readers of First Things will be combatants in the struggle against nihilism in the West. In that struggle, we must be patrons of one or another of the strong gods. Yet, we should resist the notion that our adversaries are necessarily willful enemies of the West. Some are. But most are children of the sad history of the twentieth century, which we share, even as we seek to overcome the last century’s solidarity in shame. As Raspail saw so clearly, negation, especially self-negation, can create no lasting fatherland. In the rising civil war over the future of the West, let us be guided, therefore, by affirmations. For they create a solidarity more generous than Camus’s fragile and failing brotherhood of shared disaster.

The post Overcoming Nihilism appeared first on First Things.

Finest Pieces of Plastic (November 20, 2025)

The Genius Myth:
The Dangerous Allure of Rebels, Monsters and Rule-Breakers

by helen lewis

penguin, 320 pages, $30

Writing early in 1810, diplomat and scholar Georg Griesinger gave the most detailed surviving account of the working methods of his recently deceased friend, the composer Joseph Haydn: 

He was very strongly convinced in his heart that all human destiny is under God’s guiding hand, that God rewards the good and the evil, that all talents come from above. All his larger scores begin with the words In nomine Domini and end with Laus Deo or Soli Deo Gloria. “If my composing is not proceeding so well,” I heard him say, “I walk up and down the room with my rosary in my hand, say several Aves, and then ideas come to me again.” 


By any standard, Haydn possessed genius and—a very different thing—was recognized as a genius by his contemporaries. In his lifetime (which he spent almost entirely within 100 miles of Vienna), the reputation of his music spread across Europe. He received commissions from Spain and France, and when, late in life, he finally traveled to London, he was lionized. Critics spoke of his imaginative fire, Mozart addressed him as “Papa,” and his former pupil Beethoven knelt in public to kiss his hand.  

You won’t find Haydn mentioned in the pages of Helen Lewis’s new book The Genius Myth, and not simply because Lewis shows almost no interest in the art to which I devote my own waking hours: Western classical music. That’s fair enough. Anyone dealing with a subject as vast as genius must be selective, and Lewis is aiming at a popular ­audience. (She has some interesting, if not particularly original, things to say about the Beatles.) The problem lies in Lewis’s subtitle: The Dangerous Allure of Rebels, Monsters and Rule-Breakers. Haydn’s career doesn’t fit that mold, or indeed many of the patterns of behavior that Lewis places in the foreground of her polemic.  

There was no dazzling youthful breakthrough followed by decades of self-indulgent coasting. Haydn published his first truly revolutionary string quartets at the age of forty-two and is generally held to have written his best music in the two decades before his death at the age of seventy-seven. There was no oppressed wife patiently enabling the Great Man. (Haydn’s estranged wife derided his music and low social standing, though he supported her financially until her death.) His reputation was not the product of posthumous mythmaking. (It was fully formed within his lifetime.) Haydn upheld the social order, credited his gifts to God, and was widely described as a modest and compassionate man. He made generous provision for his servants in his will. 

And so on. The same, or similar, could be said for a great number of pre-Romantic geniuses, though Lewis does quite a good job of winkling out the exceptions that serve her purpose. (Isaac Newton is the most notable, though Lewis finds further examples in Vasari’s Lives of the Artists, which she sees as a founding document of the Romantic cult of genius.) By singling out Haydn, of course, I’m cherry-picking. Having decided on the point I wish to prove, I’ve chosen the example best suited to prove it. But that seems as valid a way as any to approach a book that operates on essentially the same principle. Lewis believes that a reputation for genius gives social sanction to obnoxious behavior, and here’s a whole book of case studies, handpicked to reinforce that idea.

Anyone seeking a sustained or dispassionate interrogation of the nature of genius—the capacity of certain creative individuals to perceive and articulate something inaccessible to normal human understanding—will not find it here. Lewis opens with an exploration of “bardolatry” (the posthumous cult of William Shakespeare) and ends with the inevitable takedown of Elon Musk, and it’s hard to avoid the suspicion that the first is present primarily to enable the second. Certainly, anyone who believes that the current company at London’s Globe ­Theatre exists “to recreate the original performances as authentically as possible” can’t have seen much live Shakespeare there lately.  

No matter: As early as the first page, a throwaway parenthesis warns us that Shakespeare was a “populist,” and Lewis’s preferred readers (she’s a writer for The Atlantic, having previously worked at the New Statesman) will know precisely what to read into that. Lewis isn’t here to examine the roots of what makes Shakespeare—or Newton, or Picasso, or Tolstoy, to name a few of her targets—matter, but to unravel why we think they matter, with the implication that they don’t, really, or at least not as much as we’ve been told they do. Her thesis is that familiar progressive Theory of Everything, that greatness is a socio-political construct, and she has a flair for the dismissive zinger. “Shakespeare might have started out writing for the groundlings, but he ended up working for the Warwickshire Tourist Board.”

Lewis acknowledges but sidesteps the Haydn problem, which she terms the “Austen Problem”: the countless geniuses of all eras who contradict her thesis by refusing to fall into the toxic stereotype. Her real beef (and it’s hard to disagree with her on this) is with the post-Romantic genius myth—the idea of Man as autonomous creator, a Promethean overthrower of norms, exempt from conventional morality. From Homer until the age of Haydn, creative humans acknowledged that their abilities were the gift of a higher power. The difficulties began when they started to regard those gifts as intrinsic to their own being. The ancient Greeks called it hubris; Catholics call it the sin of pride. At no point in history would either have been surprised—as Lewis apparently is—that the upshot is “monsters, rebels and rule-breakers.”

Those sacred monsters, rather than the infinitely more complex question of the spirit that inspired them, are Lewis’s main subject, and they certainly make for an engrossing read. When you’re holding a hammer, the temptation is to see everything as a nail. But sometimes a nail really is a nail, and Lewis lands some satisfying hits. Her style is lucid, lively, and fun to read. She has a journalist’s nose for a good story and the same relish in retelling it. The chapter devoted to Richard Curtis’s 2019 film Yesterday (in which a failing musician wakes up in a parallel universe in which the Beatles never existed and promptly capitalizes on their back catalogue), and the largely forgotten screenplay with the same premise on which it turns out to have been based, is particularly juicy.  

Lewis rifles enjoyably through the soiled laundry of famous men: Isaac Newton’s academic rivalries, Tolstoy’s emotional cruelty. Forgotten, supportive wives (like Tolstoy’s devoted Sofia) are brought out of the shadows and, in line with current thinking, are implied to be the true source of their husbands’ creativity. There are serious points about the distorting effect of the post-Romantic genius myth on scientific careers. That Thomas Edison had feet of clay is old news, but in the twenty-first century—when almost all serious scientific work is conducted collaboratively by large (often international) teams of researchers—the fact that the Nobel Prize continues to single out just one name seems unhelpful at best.

Overall, Lewis seems more at ease with science than with art, and she’s at her most compelling in the first section of the book, in which she anatomizes the nineteenth and twentieth centuries’ delusional attempts to measure and predict intellect—with results ranging from the ludicrous (the British high-IQ society Mensa instructed its members to wear yellow map pins in their lapels “as a ‘sign of genius’”) to the outright eugenicist. Generally, the results spoke for themselves. Lewis cites Victor Serebriakoff’s description of Mensa as “brain-proud Quarrelsome Underachievers Limited,” and she recounts the failure of the California-based Repository for Germinal Choice, more popularly known as the “Nobel sperm bank.”

In the end, one of the few Nobel laureates to leave a deposit in the bank was the physicist William Shockley, whose youthful Nobel Prize (Lewis suggests) was far from well earned, and whose conviction of his own genius gradually curdled into racial supremacism. Lewis presents him as an extreme case of “galaxy-brainedness”—the modern delusion (endemic in academia) that genius-level achievement in a specialized field automatically makes one an expert in all fields. (At the very least, Shockley sounds like he would be insufferable at breakfast. “What law of nature have you discovered?” he snapped, when a house guest contradicted him.)  

That’s one of many cautionary tales in Lewis’s book. Her judgments are pointed and her arguments often persuasive, at least until her detachment slips. A modern parallel for Shockley’s “galaxy brain” delusions, she suggests, “might be a successful entrepreneur who succumbs to paranoia about the ‘woke mind virus.’” Well, yes. It might also be an experienced senior politician who has become so convinced of her own moral superiority that she dismisses potential voters as “deplorables.” Either would do; neither analogy is likely to persuade the unaligned.  

These partisan asides are particularly frustrating because elsewhere Lewis is willing to examine the selective blindness of her own political tribe. A troubling, thoughtful chapter discusses the wokest-of-the-woke theater director Chris Goode, whose work was critically lauded in the left-wing press until his exposure as a pedophile in 2021. Lewis’s retelling throws into uncomfortable relief—without ever quite addressing—the far-from-mythical nature of genius itself: the ideas, insights, and artworks for which these messy and sometimes monstrous people are the conduit.

That’s the really thorny question, of course, and Lewis acknowledges that she’s avoiding it. She quotes Rebecca Solnit on the way that attacks on the “genius myth” are too often a means of avoiding the far harder task of grappling with the work and its implications: “I did a quick online search and found a long parade of people who pretended to care who did Thoreau’s laundry as a way of not having to care about Thoreau,” Solnit observes. “They thought of Thoreau as a balloon and the laundry was their pin.”

Lewis has a whole quiver of pins, and she’s not slow to use them. She’s a sparky writer, though you’re left with a depressing suspicion that her eloquence will equip a small army of midwit commentators and would-be debunkers with blog-ready debating points. She makes even her less credible arguments so quotably, and with such energy, that it can only be a matter of time before we start seeing them ripped from context and posted on social media with a triumphant “THIS.” 

I’d like to think that this was not her intention, that The Genius Myth is not simply an extended argument ad hominem. It would be easier if Lewis appeared to admire—or even think very much about—the achievements of the individuals she demolishes so briskly. Ultimately, Anna Karenina is more interesting, and more important, than the question of whether Tolstoy’s private life would comply with twenty-first-century campus ethics. Lewis, it seems, disagrees. “In my view,” she deadpans, “James Joyce had one big idea—what if novels, but harder to read.” She dismisses most of Picasso’s mature output and sees in Les Demoiselles d’Avignon only “piglike pink figures with distorted faces.” All good, facile knockabout.

Yet people still grapple with Ulysses and gaze, shaken, at Guernica. Shakespeare’s plays and Haydn’s symphonies may owe their enduring relevance to accidents, social factors, and fashions. But at the core of their appeal is an essence that many people, over many years, have found to contain meaning, beauty, and truth—however they choose to define those qualities. That’s no myth. Audiences have agency, and creative reputations are not—have never been—solely what the Patriarchy, the Establishment, or Big Tech (choose your bugbear) says they are. Otherwise, the music of Arnold Schoenberg would be more popular than Beyoncé’s. As Ringo Starr once put it: “I think the main point of the situation is that those pieces of plastic we did are still some of the finest pieces of plastic around.” The Genius Myth is an entertaining, eye-opening, and sometimes infuriating read. But ultimately it’s beside the point.

The post Finest Pieces of Plastic appeared first on First Things.

The Rest as History (November 19, 2025)

Israel’s Day of Light and Joy:
The Origin, Development, and Enduring Meaning of the Jewish Sabbath


by jon d. levenson

eisenbrauns, 296 pages, $24.95

The Sabbath is making a comeback. Across the West, that most singular and ancient of weekly phenomena—a day marked by the absence of market forces, digital devices, and the manic demands of professional productivity—is enjoying a curious renaissance. The notion that modern individuals desperately need systematic respite from the matrix of expectations and neuroses imposed on them by their world is no longer marginal. Perusing the smorgasbord of self-help gurus, parenting manuals, mindfulness retreats, and decluttering guides, one constantly encounters paeans to the digital detox, frequently termed a “tech Sabbath.”

That highly successful people in the twenty-first century are rediscovering the power, beauty, and necessity of a millennia-old biblical custom will come as a surprise to everyone except those who already observe it. For those of us fortunate enough to live our lives within this propitious rhythm, the only surprising thing about this rediscovery is its belatedness. The blessings of the weekly Sabbath, aptly described by Talmudic rabbis as “one-sixtieth of heaven,” require no elaboration beyond direct experience. The Sabbath’s power has been evident for millennia. Jews across the centuries have undergone every tribulation and oppression dreamt up by humanity. Yet every week, for twenty-four hours, they have returned—liturgically and psychologically—to a state of numinous tranquility. They have rested, they have remembered, and they have affirmed their allegiance to a world in which time itself can be sanctified and the future redeemed. When Ahad Ha’am, a decidedly non-religious Zionist thinker, remarked that “more than the Jews have kept the Sabbath, the Sabbath has kept the Jews,” his exaggeration was, at most, slight.

Given the antiquity and centrality of the Sabbath to both the Jewish and the Christian traditions, it is unsurprising that a number of modern authors have sought to explicate and re-enchant this weekly institution. Perhaps the best-known effort in this vein is The Sabbath (1951), by the neo-Hasidic philosopher Abraham Joshua Heschel. That slim volume, overflowing with fabulously poetic aperçus, invited its readers to taste eternity in the guise of sacred time. As a psychospiritual tour of the Sabbath’s “palaces in time,” it has yet to be surpassed. Honorable mention must also be made of Erich Fromm’s To Have or to Be? (1976), in which the psychoanalyst’s coruscating insights illuminate the Sabbath’s role in counterbalancing the acquisitive instincts of a disenchanted world. Yet our moment, characterized by a profusion of information and an impoverishment of wisdom, demands a reconceptualization of the Sabbath that is both spiritually sensitive and intellectually rigorous, attuned equally to the history and to the phenomenology of this remarkable institution.

Stepping into this role with characteristic erudition and eloquence is my eminent mentor Jon D. Levenson. In an academic world increasingly defined by methodological parochialism, Levenson’s work has always stood apart. He has the rare capacity to harvest from a wide range of academic fields—history, theology, biblical criticism, rabbinics, and philosophy—in service of extensive and penetrating considerations of enduring theological questions. Levenson writes with exquisite religious sensibility, conveying a sense not only of the outer forms of religious praxis but also of the strivings, emotions, and aspirations that accompany them. Israel’s Day of Light and Joy is vintage Levenson, evincing the breadth of scholarship, felicity of articulation, and twinkle-eyed wit with which he has reigned over seminar rooms and lecture halls for many decades.

The early chapters address a set of questions concerning the origins of the Sabbath itself. How and when did this institution arise? Does it appear consistently across the canon of the Hebrew Bible, or are there variants, slowly converging toward coherence? Do analogues exist in other ancient cultures? Levenson leads the reader across the landscape of accepted scholarship, even dipping a toe in the waters of speculation.

His most salient claim is that the šabbāt of the Hebrew Bible may have originated in connection with a Babylonian full moon festival (šabattu), becoming synonymous with the “seventh day” only through a lengthy process of theological convergence and calendrical standardization. He reminds us that the seven-day week itself is a non-­natural phenomenon. Unlike the day (solar rotation), month (lunar cycle), and year (earth’s revolution), the seven-day structure appears sui generis. Its only general analogue in the ancient world exists within the Greco-Roman astronomical system, with each day being governed by a celestial body (hence the name of our modern seventh day, derived from “Saturn’s Day”).

Yet for all such similarities, Levenson’s most forceful point is the uniqueness of the theologically freighted biblical Sabbath. Ordained from the start as a moment of sanctity and transcendence, it has no true parallel in any ancient civilization. It is no mere “Day of Rest” (although cessation from work is important), nor is it a tribute to the powers of the planetary spheres that were once invested with deterministic power. Such a pagan cosmology, in Bertrand Russell’s arresting formulation, views humankind as “a small thing in comparison with the forces of Nature,” a pitiable slave “doomed to worship Time and Fate and Death, because they are greater than anything he finds in himself.” The Sabbath stands as a ritualized repudiation of inexorable temporality. The Sabbatical observer declares his faith in a vision of the cosmos in which humanity is not an isolated speck adrift in indifference, but a covenantal being of irreducible significance, bound from inception to God, community, and creation. This paradigm shift, foundational to the biblical revolution, is ratified every week by the imbrication of a non-natural unit of sacred time within an otherwise cosmological calendar. This jarring break functions as a subtle yet transformative simulacrum of the Bible’s insistence on mankind’s unique dual status: as a being confronted at once by the majesty of a powerful universe and the sovereignty of its all-powerful author.

Levenson devotes much time to comparing Jewish and Christian approaches to the Sabbath, particularly around the question of legal regulation. Rabbinic tradition, from its earliest texts, surrounds the Sabbath with a latticework of prohibitions, customs, and finely wrought distinctions. The act of ceasing from labor, it turns out, requires extensive and exhaustive attention. For many Christian interpreters, this has seemed paradoxical, if not absurd. For how can a day of spiritual liberation be reduced to a list of technicalities? At its worst, this rabbinic normativity is caricatured as a monument to desiccated Pharisaism, overcome by the ­liberating spiritualization of Christian grace.

Levenson rejects this caricature. Following a venerable line of halakhic thought, he argues that it is through law—precise, enforceable, and shared—that the Sabbath achieves its character. To observe the Sabbath is not merely to embrace a state of mind, but to enter into a communal choreography of rest. The laws governing the Sabbath, correctly conceived, cannot be dismissed as mere crabbed legalism. Far from hindering spiritual praxis, they underwrite it. This tightly guarded and defined form of rest also safeguards the socio-ethical component of this institution, compelling as it does kings and paupers, seigneurs and peasants, humans and animals, to return to a prelapsarian state of freedom and fellowship. This radically egalitarian state becomes possible only within a matrix of normative constraint. If Heschel rhapsodized about the Sabbath’s “palace in time,” Levenson reminds us that these marvels of spiritual architectonics require floor plans.

As elsewhere, Levenson’s work here demonstrates a deep sympathy with rabbinic interpretations of the biblical texts, as well as with their traditionalist heirs in the medieval and modern canons of Jewish scholarship. Levenson’s competence in these frequently difficult textual traditions, and his sensitivity to their subterranean theological subtleties, are uncommon for a biblical scholar and, indeed, are lacking in some modern theologians. This sympathy leads him not only to oppose the classic Pauline approach to the Sabbath, but also to point out the lamentable failure of various reformist denominations of Judaism to preserve the “essence” of the Sabbath while eviscerating its legal frameworks. Some may view these commitments as a flaw in his analytical approach. Others will count them as a strength and a welcome counterbalance to the pervasive misapprehension that rabbinic hermeneutics are inimical to sound scholarship and reasoning.

The chapter with the greatest contemporary ­relevance—and the chapter this reviewer wishes could have been more extensive—is this book’s final one, which details the challenges posed by the Sabbath to modernity, and vice versa. Levenson notes that, for Orthodox Jews in particular, the Sabbath now functions as a weekly act of defiance against the instrumentalization of human life. The refusal to use technology, to conduct commerce, or to attend to digital devices forms a profoundly countercultural posture, a theological protest against the mechanization of existence. Highlighting and entrenching Heschel’s observations, Levenson notes that various forms of the “secular Sabbath” bear only the palest semblance to the genuine article. True Sabbath is not a tool for a more efficient Monday. It stands as a reminder that human life and dignity are ends in themselves, imbued with the eternity of the divine image and the attendant obligation to tend to those parts of our lives and personhood that cannot be priced on the market, yet have worth beyond number. In an age of overstimulation, the Sabbath is a rare opportunity to step off the treadmill that claims so much of our time and attention, and dedicate ourselves to restoring our tranquility, dignity, and, ultimately, our humanity.

To be sure, Levenson’s work is hardly the final word on the Sabbath. Some of his historical claims invite further scrutiny, and his alignment with rabbinic traditionalism will perhaps alienate some of his readers. The extent to which the Jewish Sabbath is truly equipped to function as a counterweight to the excesses of twenty-first-century life is a subject that demands more extensive reflection. Yet this book’s great strength is in its aspiration: It dares to treat the Sabbath as neither a museum artifact nor an ethereal phantasm, but as a vibrant historical, theological, and moral institution, with the power to alter individual and communal rhythms of life.

To understand the Sabbath is to grasp something elemental about Jewish history, biblical anthropology, and the metaphysics of time itself. It is to encounter a vision of life in which the world is not merely a field for toil but a garden of repose, to be received in humility and joy. Levenson’s luminous work offers us the best starting point yet for such an encounter. Israel’s Day of Light and Joy is a book worthy of the day it honors.

The post The Rest as History appeared first on First Things.

We Were Jesus Freaks (November 18, 2025)

“Hey you, I’m into Jesus,” I sang, driving to school in my 1988 Buick Park Avenue, the windows rolled down, wind whipping through my hair, the bass rattling the cheap speakers that blasted DC Talk’s “Into Jesus.” As always, the dial was fixed to my local Christian radio station. I was seventeen years old.

I graduated high school in 1999, having spent a decade immersed in the evangelical subculture. It was a period marked by an astounding amount of culture-making. Yes, a great deal of the music and art was kitschy or derivative. But the subculture was formative. For many of us who grew up in evangelical homes, it wasn’t a “subculture” at all. It was simply our culture, the songs and stories and images that populated our world.

There was a fruitful tension in those years. Some students felt the impulse to distinguish themselves from the world by consuming only Christian music. And yet, the Christian music of the time often seemed driven by a desire to show how Christians can be just as “cool” as the world. The paradox of what theologian Lesslie Newbigin called the “missionary encounter” lurked behind the debates of those years: The salt must not stay in the saltshaker, but its engagement with the world must be missionary, retaining its saltiness. 

Parental concern played a big role in creating demand for Christian versions of popular cultural idioms. The mists of nostalgia can obscure how far the envelope was pushed in the nineties when it came to vulgarity and sexuality in music and movies targeting young people. Tipper Gore, wife of the senator who would become the Democratic Party’s vice president, had led the charge for warning labels on albums with objectionable content. Pornography escaped the bounds of dirty magazines and became available on screens in every home with AOL dial-up. Raunch filled the big screen: Seven of the top twenty movies in 1999 were rated R, including lewd teen comedies and salacious blockbusters. American Beauty, a film about a middle-aged man who fantasizes about his daughter’s cheerleader friend, won Best Picture at the Academy Awards. The nihilism in rock music, the sensuality in pop, the promiscuity normalized in Friends, one of the decade’s biggest TV shows, the excesses of a hedonistic culture—Christian parents had good reason to look for “safe” alternatives for their children. Evangelical culture-makers stepped into the gap, providing music, media, books, and even outlets for activism.

Contemporary Christian Music, or CCM, played a central role in the evangelical subculture of my youth. It provided the soundtrack for my adolescent life. And I’m not alone.

Contemporary Christian music flowed from the Jesus People movement that flourished along the West Coast in the late 1960s and early 1970s. At first it was a largely church-based phenomenon, with performances in small and medium-sized venues; by the 1980s, songs and artists were selling so successfully that they broke into mainstream consciousness. Amy Grant epitomized crossover success. She had two #1 Billboard Hot 100 singles and multiple hits that dominated the adult contemporary charts. As CCM matured, it became an industry and a genre of its own.

2025 marks the thirtieth anniversary of two albums that transformed the landscape of Christian music. 1995 was the year the Christian rap and rock trio DC Talk released Jesus Freak. It was also the year Jars of Clay, an alternative rock band, released their self-titled debut album. Both albums produced multiple hit singles and became ever-present fixtures in youth groups across the country. 

DC Talk formed in 1987 at Liberty University: Toby McKeehan was the rapper, Michael Tait the soulful crooner, and Kevin Max contributed a quirky vibrato to the vocals. Their 1992 rap and hip-hop album Free at Last blasted a counter-cultural narrative: Christian teaching is serious—and cool. “Luv Is a Verb” sought to rescue the word “love” from meaning “sex.” “Socially Acceptable” lacerated American culture, warning against “justifying” sin and “synchronizing to society’s ways.”

In 1995, DC Talk reinvented themselves with Jesus Freak, which blended rock and rap (“So Help Me God,” “Like It, Love It, Need It”), pop (“Between You and Me”), and even folk (“In the Light” and “What If I Stumble?”). The mix somehow gelled—an “in-your-face” combination of rock and rap, raging guitars and soaring harmonies hellbent on keeping young people from hell. The album’s title track was a deliberate throwback to the origins of CCM, the California revival in the 1970s, but “Jesus Freak” (clearly influenced by Nirvana’s “Smells Like Teen Spirit”) had a compelling, power-chord-driven enthusiasm impossible to ignore. Thousands of teenagers embraced this anthem, proudly proclaiming the freakiness of their faith in the eyes of the secular world.

In contrast to DC Talk, Jars of Clay’s first album was more brooding—the lyrics more ambiguous, the music more acoustic, employing a wide range of instruments and strings in a folk-rock collection that explored grace and love, sin and sadness. There were songs you could jam to (“Flood,” which played over the credits of the film Hard Rain with Morgan Freeman), but the majority were simple, reflective expressions of faith (“Like a Child” and “Love Song for a Savior”). At the time, youth group kids typically identified with either DC Talk or Jars of Clay, each band’s style reflecting different teenage personalities. I was more of a Jars guy, enthralled by the musical arrangements, always mulling over the meaning of the lyrics, drawn to the introspection of songs like “Worlds Apart” as opposed to the guitar bluster of DC Talk, though plenty of young people enjoyed both bands (as did I).

The output of CCM in the late 1990s was astonishing. There was inspirational pop from groups like 4Him, Point of Grace, Avalon, and singers like Jaci Velasquez and Michael W. Smith. There was rock from Audio Adrenaline, the Newsboys, MxPx, the Supertones, and Switchfoot. Even in predominantly white churches, Christian teens knew Kirk Franklin’s “Stomp,” danced along to Out of Eden, or marveled at the vocal prowess of CeCe Winans.

The arrival of the folk-rock band Caedmon’s Call signaled the rise of the gospel-centered movement soon to be called the “Young, Restless, and Reformed.” Caedmon’s Call extolled God’s grace, described human depravity, and emphasized God’s sovereign purpose in making all things new. I remember a discussion with a friend’s father who took issue with Caedmon’s Call’s “Shifting Sand,” claiming the lyrics were a false portrayal of the Christian life:

My faith is like shifting sand,
Changed by every wave,
My faith is like shifting sand,
So I stand on grace.

I defended the song, saying it’s the rock of grace that matters most, not the size or sincerity of our faith. Whoever was right in that argument, the fact that evangelicals were debating the lyrics of Christian pop songs is telling. Not all the culture-making of the decade can be dismissed as theologically shallow.

CCM created a distinct and thick atmosphere. The whole industry served as an answer to the question Larry Norman posed in one of his most famous songs of the early 1970s, “Why Should the Devil Have All the Good Music?” An Augustinian current ran through these attempts to “plunder the Egyptians” and repurpose musical forms from the world for eternal purposes. Before the industry shifted almost exclusively toward worship music, it manifested a Kuyperian mentality that mined all areas of life for signs of God’s glory and goodness.

The catalog of singer-songwriter Steven Curtis Chapman is a case in point. The man (we called him “SCC” or “S-C-squared” or just Steven Curtis) could sing about the grace of God or the hectic life of a parent or the struggles of knowing how best to love his wife. Chapman gave musical expression to the truths about God’s love and grace that I learned in my local church, shaping my young missionary heart through songs like “For the Sake of the Call” and “Whatever,” a favorite in the year before I moved overseas for mission work. 

When artists and bands achieved mainstream success—Sixpence None the Richer’s pop song “Kiss Me” hovered near the top of the Billboard Hot 100 the month I graduated from high school—we took it as a badge of honor. Such success underscored the perennial question faced by musicians: Am I a “Christian artist” called to create for the “Christian” niche, or am I an artist who happens to be Christian and am therefore free to write about anything through the lens of redemption, without regard to labels?

By the early 2000s, the Christian music scene was undergoing radical change. Napster and other online file-sharing platforms had transformed the music industry. Christian radio shifted toward songs that could be featured in Sunday morning worship. The “special music” before the sermon, once a mainstay in the evangelical “order of worship,” where someone, often a teenager, would sing a popular Christian song backed by a track, lost its appeal. Worship songs and anthems began to dominate.

Many of the criticisms of CCM today (theologically anemic, chasing secular trends, forgettably safe) were already being voiced in that era. Charlie Peacock’s 1999 book, At the Crossroads: An Insider’s Look at the Past, Present, and Future of Contemporary Christian Music, criticized the industry’s promotion of a “conformity” that “produces legalism, performance-based acceptability, and stunted, uninspired imaginations” and thus fails to offer artists “a sufficient theological or ideological foundation from which to create music, ministry, and industry.” Peacock’s call for artistic excellence was influenced by Francis and Edith Schaeffer, the apologists whose ministry at L’Abri in Switzerland took art seriously, especially its power to express goodness and truth. Years later, when I joined a writers’ group in Nashville that included Charlie’s wife, Andi, I saw that commitment to artistry—the desire to create something not merely popular but good, a gift of truth and beauty. But back in the nineties, most parents were just happy their kids were listening to music with lyrics that were, if not theologically substantive, at least not morally inappropriate.

Radio dramas and family-oriented shows also emerged to shape evangelical households. Focus on the Family, led at the time by James Dobson, published manuals and books on marriage and family life, mixing secular psychology and biblical principles, all with an eye to strengthening the nuclear family in an age marked by moral indifference. There were magazines for young men and women (Breakaway and Brio). 

But Focus on the Family’s audio show Adventures in Odyssey had the biggest cultural impact. It debuted in 1987 on the radio and became a fixture in evangelical households by the mid-1990s, available through cassette tape collections. The series was set in the fictional town of Odyssey and centered on John Avery Whittaker, the owner of an ice-cream and discovery emporium called Whit’s End. 

It’s difficult to categorize Adventures in Odyssey. Some episodes were serialized drama and suspense. Others followed the story structure of a family sitcom. Still others used the invention of the “Imagination Station” to transport kids back in time to famous moments in history or into important Bible stories. It wasn’t just a show; it was a world, with memorable characters, running gags, and moral lessons—all designed to reinforce a broadly Christian, family-friendly, patriotic framework for life. The engaging scripts, the talent of the voice actors, and the amazing sound effects turned Adventures in Odyssey into the cultural wallpaper for evangelical homes. When I babysat my younger siblings, I’d pop in an Odyssey tape for my brother at bedtime as I turned off the bedside lamp. Countless families slid cassette after cassette into car stereos on long road trips, the episodes blending with the hum of tires on asphalt and sibling squabbles in the back seat.

Adventures in Odyssey drew its share of detractors. Some complained it prioritized moral lessons over clear gospel teaching, avoiding theological specifics beyond the basic evangelical consensus. Others criticized it for resolving difficult issues (an eating disorder, for example) in the time frame of a typical situation comedy and for prioritizing the homogeneous social cohesion of a Midwestern Mayberry-type neighborhood at the expense of topics uncomfortable for white evangelicals, such as racial injustice or intergenerational poverty. Some of these critiques were valid, yet it’s hard to find a better example of influential culture-making in the evangelical world. Its fans are right to celebrate Adventures in Odyssey for standing head and shoulders above any audio drama coming out of the secular world, while critics are also right to chuckle at the fact that at the end of the twentieth century it was radio where evangelicals made their creative stand.

If Adventures in Odyssey didn’t engage an evangelical family, other entertainment options were available. The age of the VCR and DVD player made it possible to bypass the cinema and broadcast TV, delivering faith-friendly options directly to consumers. McGee and Me! was a popular series about an adolescent boy, Nicholas, and his cartoon friend, McGee, who learn moral lessons or face ethical dilemmas together. Christian videos and cartoons (such as Bibleman) appealed to evangelical parents who wanted to offer their children something “safe” yet contemporary, an option better than reruns of family-friendly shows from the 1950s and 1960s. 

But it was VeggieTales that took the evangelical world by storm with success that spilled into the mainstream. Created by Phil Vischer, with early episodes featuring writing from Eric Metaxas, VeggieTales was launched in 1993. It featured memorable songs, zany storylines, funny characters, and a conclusion that sought to impart a Bible verse or moral lesson to the kids watching. I say kids, but as teens we loved them, too. Crowded into the sticky seats of a youth-group church van, stinking of sunscreen and sweat, we’d shout-sing VeggieTales songs over the rattle of windows that wouldn’t quite roll all the way up or down.

The enduring impact of VeggieTales was so profound that, recently, when President Trump reposted an AI-generated image of a monolithic gold statue of himself in the Gaza Strip, many middle-aged evangelicals instantly recalled the most creative of the early VeggieTales episodes. “Rack, Shack, and Benny” retold the story of Shadrach, Meshach, and Abednego in the fiery furnace, with a magnificent chocolate Bunny as the monstrosity the vegetables are commanded to worship. The shared experiences within the evangelical subculture of the nineties were so strong that, thirty years later, a stray social media image caused Christian Gen Xers and millennials to start humming along to “The Bunny Song.”

VeggieTales had its critics. The secular world resisted the intrusion of biblical morals into kids’ entertainment. Conservative Christians feared that the fantastical reinterpretations of Bible stories might lead kids to think of the Bible as just a bunch of fairy tales or a book of moral lessons, not all that different from what you’d find on PBS Kids. In 2008, Russell Moore pointed out the impossibility of telling the story of Jesus’s crucifixion: A splattered tomato wouldn’t fit the show’s style. A few evangelicals were scandalized when a baby carrot portrayed Jesus in a Christmas episode (which mercifully ended before Herod’s massacre). But these criticisms didn’t carry the day. Most Christian parents were just happy to see family-friendly entertainment with creativity and production values as good as anything you could find in “the world.”

Historian David Bebbington’s well-known description of evangelicals identifies four traits: biblicism, crucicentrism, conversionism, and activism. The last of these informed the evangelical ethos of the nineties in important ways. There was often a political current running through the culture-making of the time. Focus on the Family lobbied Congress; pastor and university president Jerry Falwell sent out videos peppered with Republican talking points (and, on occasion, conspiracy theories); Southern Baptists boycotted Disney because of the company’s leftist agenda; men gathered in Washington for Promise Keepers; and the sins of Bill Clinton made headlines. Fighting for the soul of the country was championed as a demonstration of faithfulness. Churches were asleep, and Christians apathetic. It was time to wake up! Carman, a CCM star, captured the urgency in his 1993 song “America Again”:

If you wanna see kids live right
stop handing out condoms
and start handing out the Word of God in schools.

But not all the activism was political. Christian performers often paused their concerts to share opportunities for compassion ministry: feeding the hungry, building wells, or providing financial assistance to children in far-flung places of poverty. Teenagers went on mission trips—if not overseas, then to other states, where we did puppet shows, tutored kids, fixed roofs, or helped repair homes for families putting their lives back together after a natural disaster. 

The most culturally influential example of evangelical activism was direct opposition to the sexual libertinism of the time. True Love Waits was a campaign initiated by Southern Baptists. It quickly outgrew its denominational origins and became a national phenomenon. All sorts of practices and cultural artifacts grew up around the pledges teenagers made to remain sexually abstinent until marriage—purity rings, a special ball to celebrate virginity. Christian singers wrote songs and gave testimony about saving sex for marriage: Jaci Velasquez, “I Promise”; Rebecca St. James, “Wait for Me.” The Jonas Brothers wore purity rings for a time, as did Miley Cyrus. Some rejected dating altogether, advocating for a culture of demure courtship. Joshua Harris’s best-selling I Kissed Dating Goodbye became the quintessential example of what would later be called “purity culture.”

The larger society was influenced in some measure by the spread of purity culture. Between 1995 and 2002, sexual activity among teenagers declined significantly: The percentage who had had sexual intercourse dropped from 43 to 31 percent among boys and from 38 to 30 percent among girls, and the rate of teen pregnancies fell. And while some studies show that teens who pledged themselves to purity did not behave all that differently from teens who did not, it’s hard to deny a shift in cultural norms about teenage sexuality, even if True Love Waits can’t be considered the single cause of that shift.

Years later, many of us would re-evaluate the promises and pressures that purity culture placed on our generation. At times, virginity was equated with godliness in ways that exceeded biblical teaching. There were unequal expectations for young women as compared to young men, and the sloppy rhetoric of some youth pastors could imply that a teenager’s worth was tied to keeping the pledge. A sexualized prosperity gospel took hold: If you wait for sex until marriage, you’ll be blessed with a great sex life and a better marriage later. One of the earliest sermon clips to go viral in the era of social media featured pastor Matt Chandler’s story of a church minister who, in making a case for sexual abstinence, passed a rose around the room, only at the end to hold up the bruised, battered flower with petals barely hanging on, and say, “Who would want this?” Chandler said everything in his soul cried out in response: Jesus wants the rose! That’s the point of the gospel!

While the excesses of purity culture deserve pushback, a sober assessment of that era should still appreciate the aims of our parents and grandparents who encouraged us to cultivate a holy aspiration to present our bodies as living sacrifices.

I Kissed Dating Goodbye wasn’t the only book that made a mark in the nineties. Christian fiction was prominent. By the early 1990s, evangelical books and merchandise were a $3 billion business. In Reading Evangelicals: How Christian Fiction Shaped a Culture and a Faith (2021), journalist Daniel Silliman describes the evangelical bookstore as “a mechanism that forms a group as a group, bringing people into a conversation. . . . Imagination happens in a bookstore.”

Christian bookstores stocked their shelves with colorful trinkets, toys, books, and alphabetized rows of cassettes and CDs. Evangelical moms and grandmas browsed the latest addition to Janette Oke’s Love Comes Softly series, historical fiction with heartfelt storytelling, gentle romance, and an emphasis on Christian values. Visitors stepping into a Christian bookstore were enveloped by instrumental praise music quietly playing overhead, creating an atmosphere that felt unmistakably safe and family-friendly.

But “safe” wasn’t the whole story. By the late 1980s, author Frank Peretti was writing novels that placed religious concepts within the horror genre. His books shied away from bad language and anything too gory, but there was plenty of cataclysm. His view of a world engulfed in spiritual warfare seized the imaginations of evangelicals across the country. This Present Darkness became a major success for the burgeoning publisher Crossway, selling more than two million copies in ten years. Once known primarily for publishing Francis Schaeffer’s work, Crossway sensed that Peretti’s novels fit well with the latter part of Schaeffer’s career—a heightened sense of the culture-war showdown, transposed into a charismatic key that aligned well with the growing Pentecostal and nondenominational movement of the time. Readers were invited into a world of the powers and principalities at war above and around us, with our prayers making all the difference.

Peretti’s novels appeared during the Satanic panic of the late 1980s. Rumors spread throughout the country of ritualistic, occult activity underlying abuse, especially of children. Understandably, ten years later, many evangelical parents were initially wary of J. K. Rowling’s Harry Potter books, the first of which appeared in 1997. Long before Rowling was a darling of evangelicals for standing courageously against transgender ideology, the world of witchcraft and wizardry she’d created made her suspect. As my friend Shane Morris once quipped, “A lost time traveler in the early twenty-first century could just about pinpoint the year by asking who hates J. K. Rowling and for what.” By the time the final Harry Potter book arrived, many evangelicals had begun to recognize the Christian imagery in the series, and eventually Hogwarts found a place on the bookshelves in evangelical homes, next to C. S. Lewis’s Chronicles of Narnia and J. R. R. Tolkien’s Middle-earth.

But it wasn’t fantasy that captured the imagination of most evangelicals by the end of the nineties: It was eschatology. The first book in Jerry Jenkins and Tim LaHaye’s Left Behind series appeared in 1995. Based on a quasi-dispensational view of the end times, Left Behind imagined a rapture of the Church at the start of a seven-year tribulation marked by the rise of the Antichrist and the race to Armageddon. The series was wildly successful, selling more than eighty million copies. Ironically, the Left Behind series that popularized the dispensational view of a pre-tribulation rapture was partly responsible for the rapid diminishment of this perspective as a respectable interpretation of the Bible’s prophetic literature in the evangelical academy. The new generation of pastors and seminary students opted for interpretations of the end times that had historical pedigrees. 

The rise of the internet marked a wholesale transformation of the evangelical subculture. By the time dial-up gave way to Wi-Fi in every home, the resistance to Harry Potter had largely melted away, the vibrant Christian music industry of my youth had blended into worship music, and the evangelical energy shifted from building a separate culture to figuring out how to survive in an increasingly hostile secular monoculture. The idea of a protected evangelical enclave was plausible in the nineties. It grew steadily less tenable as the new century marched on. The Great Dechurching began in the late 1990s and only recently has appeared to taper off, in part because there are fewer churchgoers around who might leave. Although evangelical churches fared better than Roman Catholics and mainline Protestants in holding on to their own, no one can overlook the steady stream of people drifting away from faith. Rather than plundering the Egyptians, evangelicals felt more like they were being led into Babylonian exile.

The evangelical subculture has had its share of headline-making scandals, with artists, authors, and pastors often disappointing fans and followers who once esteemed them. Josh Harris, the author of I Kissed Dating Goodbye, renounced his books and his faith. Two of the three DC Talk members are notable for public falls from grace—Kevin Max now claims to be an “exvangelical” who follows the Universal Christ, and Michael Tait has acknowledged a double life that involved abusing drugs and sexually assaulting young men over the course of two decades. Any assessment of the legacy of evangelical culture-making in the nineties must grapple with the glaring failures—sin, injustice, and compromise. But a full reckoning of the remarkable cultural outpouring that shaped me as a young Christian must also note the quiet victories. That world shaped a resilient faith among young believers in a secular age, and many of us have benefited greatly. 

Though we may roll our eyes at its kitsch, its copycat tendencies, its “Jesus is my boyfriend” songs, we need to remember how much of secular entertainment from those years is equally forgettable. Christians don’t have a corner on cringe. The evangelical pop culture of the 1990s was far from perfect, but it helped to remind my generation of the basics of life and faith.

Picture the interlocking elements of how this subculture worked. A fifteen-year-old kid wakes up in the morning and reads a Bible passage in a student study Bible recommended by his favorite Christian band. On the way to school, Adventures in Odyssey is playing—an episode about responding with generosity to someone who insults you. He arrives at school early in the morning chill, where he joins a tight circle of believers around the cold metal flagpole, hands clasped and prayers whispered into misty air, a demonstration of his desire to be salt and light. During study hall, he finds time to read a few pages of the Christian fiction novel he bought at the bookstore the week before. After school, he heads home, listening to Christian radio on the way. The songs warn against sin, champion Christ’s redemption, and speak about standing out as a believer in the world. It’s Wednesday, so there’s church that evening, and CCM is pumping through the stereo system when he arrives for worship, friendship, and Bible teaching. These are his people, the ones who tell him he’s not alone, that others are seeking the Lord and living the great adventure of faith. After church, there’s a hangout for the youth group at a friend’s house. Teens cluster around the TV; a familiar circle of friends are laughing at VeggieTales as pizza grease stains paper plates. Others discuss their plans to attend a Christian concert next weekend. Before bed, he slides off his faded WWJD bracelet, tossing it onto his bedside table beside his battered Bible, whispering a prayer beneath posters of his favorite Christian bands taped to the bedroom wall. Another day over. A new one ready to begin.

This thick world of culture-making in the nineties forged our identities as young believers. Today, the evangelical world is splintered and fractured, and the monoculture has disintegrated not only for evangelicals but for Americans as a whole. Yet within evangelicalism, we can see flashes of various subcultures re-emerging: the popularity of conferences that draw thousands of young men, or women, or pastors; songs by Forrest Frank or Brandon Lake that become mainstays at youth camps and revival meetings; the rise of YouTube personalities and apologists; the resurgence of old and new hymns; the success of Andrew Peterson’s Wingfeather Saga; books on spiritual formation that cross denominational lines. What’s often missing for young people in church is what’s missing for young people in the world—hospitable homes that make face-to-face interactions and spontaneous hangouts possible, places where the omnipresent smartphone isn’t a debilitating distraction.

Looking back to my adolescence, I feel profound gratitude—not merely nostalgia—for an evangelical subculture that earnestly sought to offer my generation a vision of faithfulness amidst cultural upheaval. I want the same for my kids and for future generations. The culture-makers of the nineties weren’t content to curse the darkness; they lit candles instead. I want to keep the candle burning.


Image by Ian Muttoo, licensed via Creative Commons. Image cropped.

The post We Were Jesus Freaks appeared first on First Things.

The Common Sense of John Searle (https://firstthings.com/the-common-sense-of-john-searle/, Fri, 14 Nov 2025)

The twentieth-century philosopher Wilfrid Sellars drew an influential distinction between “the manifest image,” which is the way the world is presented to us in everyday experience and common sense, and “the scientific image,” which is the description of the world offered by scientific theory. For some thinkers in the Western tradition, there is a sharp conflict between these images—think of Zeno’s view that motion is an illusion, idealists who deny the reality of matter, or those who insist that science has disproved the existence of free will. But other philosophers argue that, rightly understood, the two images are in harmony. Aristotle and Thomas Aquinas are examples.

Another is John Searle, who died on September 17 at the age of ninety-three. Searle taught at the University of California at Berkeley for sixty years. He was active in the campus’s famous Free Speech Movement in the 1960s, though he came to criticize the excesses of student protesters. He made several first-rate contributions to academic philosophy, while also attaining an unusual degree of influence outside the field. The latter was facilitated by his crystal clarity as a writer and public speaker, and a personal style that was often as humorous as it was self-confident and pugnacious.

If it’s true that a man can be known by his enemies, then it tells us much about Searle that he had famous public disputes with the deconstructionist Jacques Derrida and the materialist Daniel Dennett. Both thinkers, in Searle’s view, peddled nonsense in the guise of sophisticated theory. In Derrida’s case, the sophistry involved putting forward the bold but absurd thesis that nothing exists outside of texts and then, when challenged, retreating into the perfectly reasonable but banal observation that nothing exists except in some context. Dennett’s sleight of hand was more subtle. He would pretend to be giving a materialist explanation of consciousness, but on closer inspection, Searle argued, he was actually denying that consciousness existed.

The controversy for which Searle was best known, however, concerned Artificial Intelligence. According to a criterion for intelligence proposed by the mathematician Alan Turing, if a machine could, in response to questions, produce answers that were indistinguishable from those a human being might give, then we would have every reason to judge that it was literally intelligent. Searle rebutted this claim in his Chinese Room Argument.

Imagine that Searle, who knows no Chinese, sits in a room with a set of Chinese symbols and a rulebook in English telling him which combinations of symbols to give out in response to written questions slipped to him through a slot in the door. The rulebook does not tell him what the symbols mean; it simply allows him to mimic the behavior of a person who does. What Searle would be doing in this scenario, he argued, is essentially what a computer does: manipulating symbols according to the rules of an algorithm. The resulting mimicry, no matter how convincing, would not yield genuine understanding of Chinese. Likewise, what computers do can never amount to the operations of true intelligence, but only a simulation of it.

This much-debated argument is one of several by which Searle resisted the reductionist tendencies of contemporary philosophy, which are often fallaciously promoted in the name of science. In the middle of the twentieth century, logical positivists attempted to reduce all meaningful discourse to the descriptive language of formal logic and empirical science. Searle’s first book, Speech Acts, which built on the work of his teacher J. L. Austin, was among several key texts that led Anglo-American philosophers to take a more nuanced approach to language and its multifarious uses.

Just as Searle rejected the thesis that computers might exhibit genuine intelligence, so too did he criticize the popular view that the human mind is a kind of software implemented on the hardware of the brain. For one thing, the brain cannot, in Searle’s view, properly be characterized as hardware. Computers, he argues, are not naturally occurring objects, as stones, trees, and bacteria are. They are a human artifact, just as chairs, can openers, and airplanes are. Nothing is intrinsically a chair. An object counts as a chair only relative to an interpretation assigned to it by human observers. The same is true of computers. It makes no sense to explain the human mind by reference to the idea that the brain is computer hardware, since the brain counts as “computer hardware” only relative to the interpretation imposed by a human mind. The software model of the mind thus puts the cart before the horse.

Searle was highly critical of other versions of materialism as well, such as the extreme “eliminativist” thesis that if beliefs, desires, and other mental states cannot be explained in neurobiological terms, then they do not exist at all. Searle’s 1992 book The Rediscovery of the Mind is a tour de force, a sustained demolition of what had by then become a dogmatic reductionist orthodoxy in contemporary philosophy of mind.

In his books Rationality in Action and Freedom and Neurobiology, which appeared in the 2000s, Searle turned to the topic of free will. He argued against the idea that our actions are the necessitated effects of mental events of which we are merely the passive observers—as if everything we think and do simply happened to us. Choice, of its very nature, involves an irreducible and persisting self, which actively brings things about as a result of deliberation. There is a causal gap between our beliefs and desires on one hand and our behavior on the other, and only the self can fill that gap, by way of its agency. Searle’s view is that whether or not we can strictly prove the reality of freedom, we have no intelligible model of human action without it.

In his later work, Searle’s central interest was the nature of social facts in general and of social institutions in particular. His account of the way social facts are grounded in language, and language in turn is grounded in the mind, imparted a unity to what might otherwise seem to be disparate themes in his work. Searle was typically thought of as a philosopher of language and of mind. But by the end of his career, he had produced what would more traditionally have been described as a systematic philosophical anthropology.

The main weakness of Searle’s work is that he never developed a metaphysics that was as carefully worked out as his anthropology. Searle’s materialist critics often accused him of being a dualist in the Cartesian mold, a charge he always denied. He was as committed as the materialists were to the thesis that the natural world is all there is. He simply thought their way of fitting human beings into that world was simplistic. The trouble is that the conception of nature he shared with them, which he never seriously questioned, makes his position unstable. When he emphasized the continuity of human beings with the larger natural world, he sometimes sounded like he was offering just another riff on materialism. But when (as was more often the case) he emphasized how radically different the human mind is from everything else in nature, the charge of dualism was hard to rebut.

Like his materialist critics, Searle also had a tin ear for religion; he once suggested that only “bores” could think religion was still intellectually credible. Fortunately, his attitude did not deter him from the occasional friendly engagement with the Dominicans at Berkeley’s Dominican School of Philosophy and Theology.

Searle’s last years were very difficult. Charges of sexual harassment did enormous harm to his reputation and career, and he was stripped of his emeritus title at UC Berkeley. He essentially disappeared from public life. Speaking of the specific allegations that led to his downfall, his longtime secretary Jennifer Hudin has said (in an email that was published online after his death) that “after an extensive and intrusive investigation, these allegations were never found to be true.”

Whatever the facts and whatever Searle’s personal views about religion—or rather, all the more so in view of these things—many of us who admired him and benefited from his work pray earnestly that the reality of God was one further bit of common sense that he came to appreciate in his final days.


Image by Sascia Pavan, licensed via Creative Commons. Image cropped. 

The post The Common Sense of John Searle appeared first on First Things.

Fossilized Faith https://firstthings.com/fossilized-faith/ Thu, 13 Nov 2025 06:00:00 +0000 https://firstthings.com/?p=113422
Why Religion Went Obsolete:
The Demise of Traditional Faith in America

by christian smith
oxford university, 440 pages, $34.99

Christian Smith, a sociologist at Notre Dame, has a knack for turning academic research into books that resonate beyond the ivory tower. The concept of “moralistic therapeutic deism” (from Smith’s 2005 book Soul Searching, co-written with Melinda Lundquist Denton) has entered the lexicon. Why Religion Went Obsolete deserves to have a similar impact. Academically rigorous, yet accessible to pastors and educated lay Christians, it combines standard data sources and Smith’s own survey work with insightful cultural analysis to provide a powerful exposition of some of the most important cultural changes in post–Cold War America.

There are many ways to frame the decline of Christianity. One lens is “secularization,” an account perhaps best articulated in philosopher Charles Taylor’s A Secular Age. Taylor gives a narrative spanning more than five centuries to show how we arrived at our current moment. My own “three worlds” model (First Things, February 2022) focuses on the period since 1964, and especially the Negative World that has been upon us since 2014, a world in which elite secular culture views Christianity negatively—or at least skeptically.


Smith offers a useful new lens: obsolescence. Religion is now obsolete—that is, “most people feel it is no longer useful or needed because something else has superseded it in function, efficiency, value, or interest.” This doesn’t mean that religion is hated or that no one is religious, merely that the world has moved on.

The book notes that in addition to spiritual goods, Americans once expected religion to provide immanent ones. An example is the inculcation of morals, especially in children. Religion is supposed to help people cope with the ups and downs of life. It is expected to foster social harmony and national identity. Far fewer people today see traditional religion as either providing these goods or necessary to provide them. And, as we’ll see, many Americans have found new forms of spirituality, which they perceive as providing the spiritual goods they might once have sought in Christianity.

As Smith writes, obsolescence doesn’t mean extinction. “Some people still can and do use obsolete items because they are familiar, less expensive, viewed with affection, or as a matter of principle.” Traditional television is becoming obsolete because people have moved to on-demand digital streaming and social media. Many people still watch TV, but as a medium it is in decline, with viewers skewing older. Print newspapers are even more obsolete. At age fifty-five, I still take the Financial Times, Wall Street Journal, and New York Times in print. But younger generations have moved on.

In the short term, nothing stops you from using an obsolete product or practice. But it is no longer relevant to most other people’s lives, and eventually, social ­changes will make sustaining obsolete practices difficult. Horse and buggy transportation is obsolete: The Amish continue to use it, but doing so requires them to maintain a lifestyle that is detached from mainstream American life. Print newspapers may be even less sustainable. When they are no longer produced, people like me won’t be able to buy them at all.

Smith’s framing device helps us understand certain features of today’s religious landscape. First, it can explain why there are still many millions of practicing Christians, and why Christianity may well persist indefinitely into the future in America. Obsolete doesn’t mean extinct. Second, it allows us to see that the shift in views of religion in America wasn’t driven by any anti-Christian or anti-religious animus or plot, but rather, as I outline below, by a complex matrix of social changes. As Smith writes, “very little of what caused American religion’s obsolescence was planned or intended by anti-religious agents.”

Smith also helps us understand that, over time, obsolescence is likely to make sustaining the practice of Christianity more difficult. Already, for example, youth sports leagues schedule games on Sunday mornings. These sorts of changes are a product of obsolescence, and have little conscious relationship to religion at all.

For Smith, social survey data show a break in social attitudes starting in 1991—close to what I identified as the start of the Neutral World in 1994. In fact, multiple independent analyses show that something did change in the culture of post–Cold War America. Religion had been a key weapon in the West’s moral war with the avowedly atheist Soviet empire, and the empire’s collapse was consequential because it allowed Christianity to be unbundled from what it meant to be a Western, liberal, democratic society. In Smith’s view, the shift to religious obsolescence was largely complete by 2009—a little earlier than my suggestion of 2014 for the start of the Negative World, but comparable.

The most impressive thing about Smith’s book is how many social trends and events he adduces—both inside and outside the church—in support of his thesis. By my count, he discusses forty-one different historical developments, ranging from the increasing number of women in the workforce to the rise of televangelism to global neoliberal capitalism to postmodernism. Most of these developments will be familiar to readers already, but together the effect is overwhelming.

For example, Smith discusses the impact of global neoliberal capitalism. Religion in America has implicitly been associated with settling down, rootedness, and stability. Neoliberalism, by contrast, values mobility (of goods, services, capital, ideas, and labor). It created a more competitive environment for careers, which required younger people to focus on and invest much more time in them. It imposed an ethic of dynamism and constant change—moving geographically, changing teams, learning new skills, and so on. Neoliberal values were also at odds with Christianity, Smith writes: “In a thousand and one ways, neoliberal capitalism socializes people to value autonomous individualism, continual innovation, material prosperity, market exchange relations, consumer satisfaction, endless competition, globalized cosmopolitanism, and the monetizing and marketizing of almost all aspects of life.”

As one might expect, Smith also discusses the digital revolution as a force undermining traditional religious practice. The internet has taken up large amounts of our time, created new ways of finding more flexible and less demanding community, made it easier to broadcast negative stories about religion (or religious figures behaving badly), broken the monopoly previously held by official sources of information, and given religious doubters the ability to build communities of their own and look for recruits. 

On topic after topic, Smith describes not just the phenomenon itself, but also how it worked to undermine traditional religion. Obsolescence results not from any one phenomenon or even a handful of them, but from their combined force. This fact makes his analysis difficult to refute: Even if a reader were, for example, to disagree with his view on ten of these trends, that would still leave more than thirty.

Not all of the book is equally compelling. When he arrives at the 1990s and 2000s, Smith uses the term “Millennial zeitgeist,” leaving unclear what exactly he means and even whether he is referring to the Millennial generation or just the millennium. The term seems to add little, even if Smith’s remarks on the era and his own survey findings are typically insightful.

“Traditional religion”—Christianity and Judaism—may be obsolete. But there are new forms of religious sensibility that are better aligned with today’s cultural conditions. We can think of this new sensibility as a form of re-enchanted spirituality: an individualized, personally customized, syncretic form of religion that is spiritual but not religious, is seen as a means of discovering and realizing one’s authentic self, and often draws on Eastern religious influences or the occult. It is the kind of spirituality Rod Dreher wrote about in his book Living in Wonder.

Smith summarizes the point: “Religion did not become obsolete because secularity won the day. Religion lost out in good measure because alternatives that are actually more like religion than secularism emerged as cultural options that proved attractive to many post-Boomers. These ideas and interests replaced religion more easily than secularism could. Traditional religion has to compete against spirituality and occulture.” Religion’s decline, he argues, “has not been due to its farfetched belief contents—as most atheists and some secularization theorists would have it—but because of its own fossilized cultural forms that it was unable to shake.”

The implications of Smith’s book are challenging for conservative American Christians whose strategies for the future have tended to involve doubling down on the very elements—the “fossilized forms”—of traditional religion that are now obsolete: rootedness, stability, family-centeredness, thick community, institutions, and historic practices and distinctives. This is the paradigm of Rod Dreher’s Benedict Option and, to some extent, of my own work.

But if Smith is right, this strategy will probably only ghettoize the Church by making it even less relevant to mainstream society. It is the “build an ark” approach, which is designed to help the Church survive cultural change but which at some level involves giving up on or disengaging from society. 

An alternative, to reconstruct Christianity so that it aligns with new cultural conditions, would represent a new form of seeker sensitivity, which would come with its own downsides. It’s by no means clear whether it is possible to do this and remain fully within historic Christianity. The evangelical Emerging Church movement tried it in the 1990s by embracing the postmodernist moment, but it failed to build or sustain institutions and went into steep decline. What was left of it ended up abandoning Christian orthodoxy and merging with progressive Christianity. This is a cautionary tale about attempting to embrace social trends.

But regardless of how the Church responds to today’s world, Smith’s book represents a powerful cultural diagnostic, which American religious leaders need to read and take seriously. The implications for the future of American Christianity are profound.

The post Fossilized Faith appeared first on First Things.

Petrarch: Rime Sparse 81 https://firstthings.com/petrarch-rime-sparse-81/ Wed, 12 Nov 2025 06:00:00 +0000 https://firstthings.com/?p=112992
I am so wearied by the ancient weight
Of my own sins, by my bad habits’ load,
I go in fear that I’ll fail on the road
And fall into the hands of one I hate.

A great friend came to free me from this strain
With courtesy so high words fail its height;
And then He flew so far beyond my sight
I struggle to see Him again in vain.

But His voice still resounds down here today:
“All ye that labor now, behold the way;
Come unto me, if clear the close pass lies.”

What destiny, what grace is it, what love
Will give me wings and make me like the dove,
That I may rest and from the earth arise?

The post Petrarch: Rime Sparse 81 appeared first on First Things.

Strange Gods https://firstthings.com/strange-gods/ Wed, 12 Nov 2025 06:00:00 +0000 https://firstthings.com/?p=112997
We promised Joshua that we would serve
the god who brought us to this land. Of course.
We took an oath and swore we wouldn’t swerve.
We’ve heard the stories all about the force
that crushed old cruel masters, blazed and towered
to shield us. We have the testimony here,
where Aaron’s sons keep watch. Our parents cowered
beside the mountain. Yahweh made it clear.
But they had manna every day, and quail,
through years of roaming to this destination.
Now rain’s been gone too long—our fields will fail—
and so we kneel on stone in desperation,
grasping, blistered lips pressed to a pole—
we might all starve if someone doesn’t bleed—
of course we know that Yahweh’s in control,
but what else can we do when we’re in need?

The post Strange Gods appeared first on First Things.
