October 2025 Archives - First Things
Published by the Institute on Religion and Public Life, First Things is an educational institute aiming to advance a religiously informed public philosophy.

Enjoyably Evangelical
https://firstthings.com/enjoyably-evangelical/
October 7, 2025

From Calvinist to Catholic
by Peter Kreeft
Ignatius, 192 pages, $21.95

Not many people need to write an autobiography—especially not philosophy professors. Not many people who take the time to write an autobiography will do so in under 200 pages. But Peter Kreeft is unlike most people (and most philosophy professors).

For starters, there is his sheer output: more than one hundred books, covering everything from apologetics to surfing to Socrates to C. S. Lewis to the Catholic Catechism. Even more impressive is how he writes: pithy, humorous, opinionated, insightful, good-natured, and playful, yet serious and logical. Since Kreeft loves lists, permit me to list seven of my favorite lines from the book.


On the freewheeling way of being a kid in the 1940s: “Our play had more freedom, and our work and thought had more order.”

On young people today: “Teenagers are borderline human beings with their brains not in their heads but in their hormones or their smartphones; confused know-it-alls, dogmatically skeptical; spoiled brats who see their free gift of education as an imposed prison sentence. That is an exaggeration and a joke, of course, but if it weren’t at least 49 percent true, it wouldn’t be funny at all.”

On the frugality of the Dutch: “Dutch Calvinists make Scotchmen look like wasteful profligates. . . . How can you tell a Dutch house? By the paper plates on the clothesline.”

On the vacuity of modern praise choruses: “They make me feel excruciatingly embarrassed, as if I were watching a nudist colony of drunk Neanderthals loudly ‘sharing their feelings’ about their feelings. Yes, I know the people who sing them are sincere. They are terribly sincere. So is Ned Flanders.”

On what atheists do not see: “Religion is indeed a crutch, as the atheists argue. And until the atheist confesses that he is a cripple, he will not be in the market for a crutch.”

On the ubiquity of DEI in higher education: “Yes, I am a critic of the ‘diversity, equity, and inclusion’ police, because I believe in all three of these things and they do not, any more than Robespierre and the ‘reign of terror’ in the French Revolution believed in ‘liberty, equality, and fraternity’ as they claimed they did.”

On why his autobiography says nothing about politics: “Today the majority of scholars, sociologists, psychologists, anthropologists, and journalists see politics as more powerful, more practical, more realistic, more true, and more important than religion, while those who actually believe what their religion teaches (whether it’s Christianity or another religion) must see things in the opposite way: that our politics is only our human invention and fashion . . . Therefore, when religion lets itself be influenced by politics, it always gets corrupted, while when politics lets itself be influenced by religion, it gets sanctified.”

Like I said, the man knows how to write.

The course of Kreeft’s life is not hard to trace. Born in 1937 into a loving home in Paterson, New Jersey, Kreeft learned Christianity from his parents and from his pastor in the Reformed Church in America. In 1955, he traveled west to attend Calvin College in Grand Rapids, Michigan. There he began his conversion—as the title indicates—from Calvinism to Catholicism. Almost the entire “autobiography” is the intellectual story of Kreeft’s conversion and an apologia for the Catholic faith. The rest of the story involves going to Yale, transferring to Fordham (where he completed his MA and PhD), marrying Maria, and then, in 1965, joining the faculty at Boston College, where he has lived and worked for sixty years. There is not much in this memoir about Kreeft’s personal life, especially as an adult, except for the general sense that, in the words of his biographical note, “He loves his five grandchildren, four children, one wife, one cat, and one God.”

As it happens, forty years after Kreeft was born and raised in the Reformed Church in America, so was I; and forty years after he left home to attend Calvin College, I left home to attend Calvin’s rival, Hope College—the RCA school in Holland, Michigan that Kreeft should have gone to! When Kreeft writes about the good and bad of tribal loyalty among the Dutch, the seriousness of Calvinist piety, and life in Grand Rapids, I know of what he speaks. Although we grew up two generations and seven hundred miles apart, we had many of the same cultural experiences: “Everyone I knew was Dutch, frugal, conservative, and Republican.”

But of course, our stories diverge in one absolutely fundamental way. My story has not been from Calvinist to Catholic but, one might say, from Calvinist to more Calvinist. As a Presbyterian pastor, I am not convinced by Kreeft’s apologetics for papal supremacy, or apostolic succession, or Marian devotion, or sacerdotalism, or Purgatory, or indulgences, or a Tridentine interpretation of justification, or a dozen other aspects of Catholicism, for which he argues with admirable brevity and lucidity. If a man is to be a Catholic, however, I hope he is the Peter Kreeft kind—full of good cheer and with a Protestant-like appreciation for the Bible, a personal relationship with Jesus, and the need for grace (even if we disagree on the nature of that grace).

Obviously, this is not the place to reengage the many areas where Catholics and Protestants continue to differ. But since most of the book is about Kreeft’s journey from Calvinist to Catholic, I think a few reflections are in order. Let me start with the area where he lands the best punches.

As is often the case when Protestants swim the Tiber, Kreeft’s initial attraction to the Catholic Church came by way of beauty and mystery. For Kreeft, this movement began with seeing St. Patrick’s Cathedral and reading St. John of the Cross. “My heart started moving down the road to Rome earlier than my head,” he writes. Kreeft found the poetry of the liturgy to be like fine wine, satisfying a deep thirst for joy and beauty. His aesthetic sensibilities resonated with architecture that was expansive and romantic rather than frugal and utilitarian. Whereas Calvinist churches were plain, humble, homely, and unpretentious, Catholic cathedrals felt elaborate, spectacular, and otherworldly. The eventual move to Catholicism was not a change from barrenness to pregnancy, he insists, but from appetizers to a main course. In Kreeft’s telling, Catholicism gave him the “more” he did not know he was missing.

One does not have to agree with all of the “more” to accept that Kreeft has identified a real weakness in many strains of Protestantism (and of evangelicalism in particular). I love the Puritans and their emphasis on the word over spectacle and their concern for the “beauty of holiness,” but more than a few low-church Protestants almost wear their utilitarian aesthetic as a badge of honor. Too many of us consider ill-prepared, poorly crafted spontaneous prayers the only way to be sure the Spirit is at work. Kreeft is right to remind us that personal prayers may be precious to God, but they are not public liturgy. As Protestants, we would do well to think more deeply about how formality, tradition, structure, and space are critical supports in our commitment to worshiping a God who is high and lifted up and whose glory is beyond tracing out.

If transcendence, beauty, and mystery are Kreeft’s most compelling points for me as a Protestant, what about some of his less convincing arguments? Let me mention three.

First, I wonder whether Kreeft has rejected a caricature of Calvinism instead of the real thing. For example, he makes much of the fact that in Catholic theology, grace perfects nature. This is said to be a major weakness of Protestant theology. But it does not have to be. The great Dutch theologian Herman Bavinck anchored his whole Neo-Calvinist project in the conviction that grace restores nature. The same idea can be found in Francis Turretin and in many other Reformed systematicians.

Similarly, I am not convinced that Kreeft has dealt fairly with Calvin’s doctrine of free will. Kreeft believes strongly in God’s providential supervision over all things. He rejoices to know that God is the author of all our stories. He loves Romans 8:28 as a “castle keep” against doubt and despair. He stresses the importance of divine sovereignty. He also argues that Calvin was so “rationalistic” that he denied the other half of the great mystery: that we have free will even as God is sovereign. “If we are not free,” Kreeft argues, “then all morality and all justice is meaningless.” But this confuses several kinds of free will. Though Calvin denied that our wills, bound by sin, were free to choose salvation apart from God’s sovereign grace, he did not deny that we make real choices. Calvin preferred not to call this prerogative “free will,” but he granted that freedom from necessity “so inheres in man by nature that it cannot be possibly taken away.” In Reformed theology, God must unilaterally cause us to be born again, but he always works by renewing the will, not by abolishing it. As the Canons of Dort (one of the high statements of Reformed orthodoxy) put it, God “does not act in people as if they were blocks and stones,” nor does he “coerce a reluctant will by force.”

Second, I wonder whether Kreeft has dealt fairly with the diversity that existed in the Church even before the Reformation. He insists that not one of the supposed corruptions Protestants have found offensive and unbiblical—the primacy of the bishop of Rome, the Eucharist instead of the Bible as the center of worship, the real presence of Christ in the Eucharist, the Mass as a propitiatory sacrifice, devotion to the saints and to Mary, her immaculate conception and sinlessness, her assumption into heaven, the doctrine of purgatory, and baptism as a saving act that works ex opere operato—“ever caused any protest or controversy or appeared as new” for 1,500 years. “If the Catholic Church today is full of so many heretical barnacles,” he asks, “why did no one in the world, no saint or sage or prophet, ever notice them and protest until Luther and Calvin?” Surely, the presence of the Waldensians, the Lollards, and the Hussites, not to mention the many conflicts between East and West, the many disagreements among the medieval schoolmen, and the many divergent views propounded by the Church Fathers and ecumenical councils undermine such a protest-free version of Church history.

Third, I wonder whether my Catholic friends have considered what may be broken in Catholicism itself when most rank-and-file Catholics do not know the Bible and do not understand salvation by grace. This is not a criticism of Kreeft so much as a different conclusion from the one he reaches. Kreeft presents an enjoyably evangelical version of Catholicism. And he does not describe his conversion as a rejection of Protestantism so much as a growing-up out of Protestantism into the adulthood of Catholicism. I suppose my question, then, is why it seems that so few Catholics grow into Kreeft’s version of Catholicism without the help of Protestantism.

Kreeft acknowledges that Protestants are superior to Catholics in three things: sermons, hymns, and familiarity with the Bible. Is it possible that the rejection of sola scriptura (the conviction that the Bible is the final and supreme authority on all matters of faith and practice) has meant, on a practical level, that Scripture is not deeply taught or emphasized in most Catholic congregations? If the word of God everywhere in the Bible forms the people of God, corrects the people of God, and tells the people of God who they are to be and what they are to do, then surely anything that leads the Church away from knowing, studying, and hearing that word is rightly regarded as suspect, even highly suspect.

In another moment of admirable candor, Kreeft acknowledges that when he has asked the Catholic students at Boston College why God should admit them into heaven, more than 90 percent begin with “I” statements like “I did my best” or “I am a kind person.” According to Kreeft, about half mention God’s mercy, and fewer than 10 percent mention Christ. Even if I were convinced that all the “more” of Catholicism was correct, I would have to wonder whether the benefits of the “more” had swallowed up the most important things that should come first. What does it profit a man to gain all the cathedrals of Europe if he forfeits his own soul?

At one point, Kreeft says he personally knows of at least twenty people who have converted from Calvinism to Catholicism, but not one who converted from Catholicism to Calvinism. All the data point in a different direction. Catholicism in America continues to decline, at a slightly faster rate than evangelical Protestantism. (My conservative Presbyterian denomination is one of the few that is growing.) In twenty-three years as a pastor, I have seen literally hundreds of people come into the Calvinist fold from a Catholic background. Usually, the first thing they mention is that they never heard the gospel clearly in the Catholic Church and that they thought the essence of Christianity was about being a good person. Maybe they did not have ears to hear what was there in the Catholic liturgy or in the weekly homily. But it is at least worth considering why those who grow up with all the “more” of Catholicism often know little of the Bible and are routinely unaware that God saves sinners by grace through faith, and that this is the gift of God, not a result of works, so that no one may boast (Eph. 2:8–9).

Kreeft takes some big swipes at Calvinism throughout the book. This is no mealy-mouthed apologetics. And yet, even when I disagreed (strongly at times), it was impossible not to like the apologist. It is easy for people who believe strongly in their views to pursue their cause with scorn, with monomaniacal focus and self-importance. Kreeft has none of these attributes. Instead, he is always self-aware and self-deprecating. He admits that he is better at reading and writing than at pastoring. He acknowledges that he hates small talk, which he has come to realize is a fault, not a virtue. He confesses that he is too often weak-willed, self-indulgent, and a whiny complainer. There is an important lesson here for polemical writers of all ages and on all topics. If you are determined to correct others, demonstrate that you are even more determined to correct yourself.

Wisdom doesn’t always come with age, but when the already wise get older, their wisdom is especially sanguine. I am still thinking about some of the pithy sayings at the end of the book—such as “I have learned to expect less of people.” And “I have learned the fragility of reason as well as the blindness of emotion.” And “I have learned the necessity of suffering and sorrow in order to know God and to become holy.” These are simple truths, but also profound. And there are many more like these.

I am thankful that Peter Kreeft has spent his career arguing for the reasonableness of Christianity. I am thankful that after all these years he still loves the first question-and-answer from the Heidelberg Catechism. Most of all, I am thankful that he loves Jesus and wants others to love Jesus. I do not agree with the Roman Catholic faith. In fact, I think it represents a serious and dangerous departure in many places from the truths of Scripture. No doubt, Kreeft thinks my theological errors are serious as well. So if we ever meet, I suppose our conversation will take as its point of departure the saying that came to describe the relationship between him and his Calvinist father: “Hate the heresy but love the heretic.”

Technological Nationalism
https://firstthings.com/technological-nationalism/
October 6, 2025

Some years ago, I was visiting an acquaintance in Cambridge, Massachusetts. He had a few friends over for lunch. One was a graduate student at Harvard’s Kennedy School of Government. That institution was founded in 1936 for the purpose of providing America’s leaders with a firm grounding in the latest social-scientific research, the better to inform their decisions and improve the lives of their fellow citizens. And so, it was something of a shock to discover that this young man was unable to express a particular loyalty to his fellow citizens. “Surely,” I pressed him, “you feel a greater responsibility for unemployed workers in Ohio than for those in other countries.” He heatedly dismissed my query, insisting that there can be no moral basis for the “tribalism” implied by my question. His duty was to all of humanity.

This cosmopolitan mentality is widespread among elites, and it troubles Alex Karp. In The Technological Republic: Hard Power, Soft Belief, and the Future of the West (co-written with Nicholas W. Zamiska), Karp inveighs against Silicon Valley’s lack of patriotic ardor. In 2018, employees at Google protested against that company’s contract with the U.S. Department of Defense. A few months later, the anxious denizens of Google’s C-suite decided not to renew the contract. According to Karp, the episode is typical. Tech titans (and their underlings) see “themselves as existing essentially outside the country.” They benefit handsomely from the American regime, which allows them to make fantastic profits. But they cannot rouse themselves to defend the country. Worse, they often demonize those who do so.

Karp is co-founder and CEO of Palantir, a tech company that is well known for the services it provides to the Department of Defense and U.S. intelligence agencies. In this regard, his polemics against the hypocrisies of the tech establishment may seem to be a form of brand promotion. Super-profitable social media companies spend billions to maximize user addiction, while noble Palantir serves the nation. But Karp has bigger fish to fry.  

Karp is worried about the triumph of the Last Man. The moniker comes from Nietzsche, who thought that the modern condition encourages two baleful trends. The first is a sentimental, enervating moralism, the sort that awards all runners trophies and shrinks from traditional notions of nobility and excellence. Karp exposes the nihilism implicit in the seemingly well-meaning nostrums of progressive morality. Take “inclusivity.” It promises to break down boundaries. But to do so we’re required to refrain from judgment, which means cultivating an agnostic attitude. Who can say what is “excellent” or “noble”?  A smiling pretense of equality must reign, the elimination of rank. Habits become habitual, and over time we find ourselves believing in very little. We have to, the better to promote inclusivity. Karp: “The problem is that tolerance of everything often constitutes belief in nothing.”

Thus, what seems like a grand and ambitious ­project—to midwife a multicultural society that transcends all differences—produces moral flatness. You have your “values”; I have mine. The implicit relativism unmoors the talented and ambitious members of society. There’s nothing authoritative to honor and serve, so one just gets on with life. The more cynical recognize that mouthing progressive pieties provides protection and helps you get ahead. This pattern of behavior was on full display during the height of Black Lives Matter hysteria. It’s not that ambition has died; rather, for the Last Men who run corporate America, the enterprise of life has little to do with strong beliefs strongly held.

The second trend in modernity is the triumph of a materialist worldview. What really matters in life is the maximization of utility, the attainment of comfort and safety. These influences encourage smallness of soul, a complacent, risk-averse existence. To seek something great is too dangerous. And why bother? As we learn in our college seminars, “greatness” is an illusion, a cultural construction designed to serve the material interests of the ruling class or some other center of power.

One can see the appeal of the Last Man. He promises peace, for if nothing is worth fighting for, then no one will fight. But it is a lifeless and post-human peace. It offers wealth, but no sublime objects of devotion. As Karp puts it, we live in a soul-numbing era in which the greatest ambition of tech entrepreneurs has been to build a world of “online advertising, photo-sharing apps, and food-delivery empires.”

The antidote to smallness of soul is commitment. By Karp’s reading, the second half of the twentieth century was characterized by an unbalanced animus against national identity. (I agree wholeheartedly with this reading of recent history.) Fearful of excessive nationalism, cultural leaders steered our imaginations away from traditional themes of civic solidarity. Karp quotes Richard Sennett, a representative establishment figure, who spoke of “the evil of a shared national identity,” and Martha Nussbaum, who during the heady End-of-History 1990s derided “patriotic pride” as “morally dangerous” and insisted that we pledge allegiance “to the community of human beings in the entire world.”

Roger Scruton dubbed this outlook oikophobia, a fear of affirming one’s own home. Karp insists that we must move beyond this unfortunate mentality. We can start with “a forceful and forthright conversation about national identity.” Ambitious Americans need to place before themselves a shared project, a collective enterprise that they can build up and serve with their ample talent. Rather than endless critique of America’s sins, we need to identify and champion an inheritance worth defending and amplifying.

Karp often defaults to business talk about “innovation,” but his deepest concerns are spiritual. He worries about the eclipse of the heroic dimension of life, and he intuits that we cannot gin up aspirations simply by calling for them. The human spirit responds to the lure of something greater, something transcendent. Higher things summon us, demanding our loyalty and asking for sacrifice and service. Therein lies the heroic spirit. Men do not aspire to great things in order to burnish their résumés. They do so when fueled by the burning power of love and devotion.

There are three main domains of love and devotion: God, family, and nation. Karp quotes Irving Kristol’s call for the West “to breathe new life into the older, now largely comatose, religious orthodoxies.” But in The Technological Republic, Karp mentions the things of God sparingly. About marriage and family he has nothing to say, which is sad. For the most common heroism in human affairs can be found in steady marital loyalty and sacrificial love of one’s children. 

Karp’s emphasis falls on renewing the American nation. This project is not only needed; it is an entirely realistic enterprise. The American people hunger for solidarity. They may be seduced by the cornucopia of consumer goods, but they do not wish to live as Last Men. They desire to be part of a great and noble undertaking, and there can be no doubt that America’s powerful mythology, if revived by intelligent and committed leaders, can satisfy that desire.

I can hear the chorus of objections. “Nationalism encourages xenophobia, promotes flag-waving excess, and leads to conflict!” Yes, all strong loves can go astray. But we do well to keep the words of the German poet Hölderlin in mind: “But where danger is, also grows the saving power.” Karp has made an astute discernment. We live in a love-deprived age. And he proposes the proper remedy. We must take the risks of commitment if we’re to revive the greatness of the human spirit.

Bring Back Beautiful Sermons
https://firstthings.com/bring-back-beautiful-sermons/
October 3, 2025

St. Augustine remains the Church’s greatest preacher. A single sermon of his can roam in many directions. That is a marvelous virtue, because it reflects reality: God’s glory is beheld in the breadth of the world, given to us in the breadth of Scripture. The homiletic habit of today—one point, please!—can be an enemy of divine beauty, the oratorical devolution of Benjamin Jowett’s dictum that any given text in the Bible has only “one meaning.” How wretched!

I recently revisited Augustine’s sermon on Luke 10:38–42, the story of Martha and Mary. Unlike many sermons I have heard on this text, Augustine doesn’t issue a simple imperative: Don’t be like Martha distracted by work; rather, contemplate Christ’s teachings or rest with him in prayer. It turns out, says Augustine, that we embody both Martha and Mary—indeed, we must. The two sisters are parts of our one life, the life that is “here” and the life that is “coming.” As always with Augustine’s sermons, there are examples and references to various aspects of his listeners’ daily life—their questions, challenges, pitfalls, and worries. Instead of pruning their concerns, he lets Mary and Martha illuminate it all. Life encompasses both anxious working and attentive resting. These are integrated in the singular life of Jesus, who works as a “servant” like Martha, just as he fulfills that labor and offers it to us in the restfulness of Mary. The vast world is taken up in the Messiah. It is a beautiful sermon.  

A sermon should be its own microcosm, taking in the world and presenting it back to us in turn. After all, the world is God’s: “The earth is the Lord’s, and the fulness thereof; the world, and they that dwell therein” (Ps. 24:1). Thus, in a good sermon, what is truly beautiful—the “beauty of the Lord” (Ps. 27:4) in all its perceptible glory—will unfold before our eyes and in our hearts as they behold a clear vision of God’s handiwork, revealed in his Son.

The world is vast, of course—and so, too, is a beautiful sermon. Augustine is hardly a craftsman of the infinite. For no sermon—or poem, or book, or treatise—could ever reach the measure of such a truth as God’s creative being, his gift-giving of our lives. Still, most sermons don’t even try. The issue is not the absence of well-turned rhetoric. There are many fine speakers in our churches. But their sermons, for all their nicely framed oratory, their winsome stories, their punchy demands, are nonetheless too deliberately small: at best unattractive and at worst ugly.  

We usually assume a sermon should have a “message,” whether doctrinal, moral, or political. Try to do that, avoid this, understand that. Any of these messages can (and should) be true, true enough in the context of their limited scope. Get the relation of the Son to the Spirit right; define self-control well in this or that situation; be faithful in such a way. None of this, however, constitutes the world, God’s world. These messages point only to bits and pieces, threads pulled out from a rich tapestry. Their colors are muted, and their purposes, in their isolated incompleteness, opaque. Unraveling the threads of life into discrete lines renders the world a drab place indeed. One reason I read less and less theology (and wilt more and more at sermon time) is because so much of it has left the world behind. What’s offered are only dregs.

Consider another story from Luke’s Gospel, the woman washing Jesus’s feet with her hair (7:36–50). Modern sermons on the topic pick out one true thread of the story: salvation through faith rather than works; Jesus’s atoning sacrifice symbolically rendered; love engendered by divine forgiveness; narrow-minded judges of others; no one is excluded from God’s mercy. Augustine’s sermon on the text gets most of these topics in—all of them, in fact, as far as I can tell. His are not one-point sermons. In fact, he adds a lot more: hedgehogs and hares, the nature of the Church, heresy, ignorance, mission, and even the woman’s hair (which intimates the transience of possessions, mortality, eternity). Yes, there is a lot there. And, of course, as Augustine lays things out, other texts of Scripture proliferate, from the New and the Old Testaments, from psalms and epistles.

This last point is important. While Augustine’s sermon perhaps seems rambling to modern ears, because it doesn’t latch onto one thread, his forays among the words of Scripture are precisely what opens the sermon up to the larger reality of God’s work in the world. So much in one story. So much of God, and thus so much more beautiful. It is not the case that the more Scripture is quoted in a sermon, the more beautiful it is. Still, scriptural proliferation is always better than scriptural parsimony. There is, after all, a metaphysical force in play here: The Word works; so, too, do the Word’s words. “His work is honourable and glorious” (Ps. 111:3).

Preachers who agree with this in principle are rightly concerned, however, that listeners have limited capacities in the face of scriptural proliferation and of the vast world it unfolds. Let’s stick to one point, they think, and make use of a few well-chosen citations, if needed. Even better, let’s illustrate the matter with a good story that “brings the point home.” Maybe something from my childhood. 

The concern about saying too much is well-founded. We live in an ugly time, and we are inured to ugliness. True beauty is hard to swallow. But the answer to this problem is not to dish out more ugliness. Rather, we must train ourselves in beauty. The world must be opened up, bit by bit. And Scripture does this best. Preaching on the lectionary’s multiple readings—not just one—is a good way to begin. Let the words of one lesson knock upon the door of another’s—and then let us be so bold as to open it just a little and discover what comes rushing in and out.

Luke 7:36–50 comes up in many churches’ weekly lectionary, paired with 2 Samuel 11:26–12:10, 13–15, which pertain to David’s adulterous and murderous affair with Bathsheba and the child he conceives from his criminal lust. There are sexual allusions here, as well as intimations of horrendous violence, wretched loss and mourning. Read with Luke 7, these verses from Samuel give new depth to the woman washing Jesus’s feet with her hair and tears, weaving together multiple threads: disobedience, self-discovery, Eve, Mary, the Church, families, desire, infidelities, forgiveness, mortality, and hard renewal. Surely, this is a far more beautiful world—God’s world in the emerging breadth of his grace—than one fervently built on “justification by faith alone,” or on the value of the sacrament of penance. No one sermon could follow all these threads. But weaving a few would press at least a little beyond our ugly complacencies.

I was once challenged to give a scriptural sermon on the “rock badger” (or “coney” in earlier English biblical translation). I never had either time or occasion to do so. (O theologian, put your money where your mouth is!) But I did ponder the matter and discovered how Jerome and Augustine, among many others, had made their own forays into the homiletics of the coney. The coney finds his way into many sermons, it turns out, and through a variety of pathways limned by traditional interpretive approaches (literal, moral, allegorical). By the eighteenth and nineteenth centuries, Christian poetry had taken up the habit of coney-sermonizing quite happily. There’s much to say. An unclean animal according to Leviticus 11:5, the coney is also praised for hiding in the rocks (Prov. 30:26). And the “rocks”? We can imagine how easily the coney might find a home in Luke 7, the fallen and feeble woman now taking refuge in her Lord. And why not? Jesus was “with the beasts,” we are told (Mark 1:13), surrounded by angels and in a wilderness much like our own, beset, tempted, and nourished by the Word of God, like bread from heaven. To see him there is to glimpse the most beautiful world imaginable.

It is a world to which we need to have our eyes and hearts opened. We work, laboriously; we listen, haltingly if in ravishment. Martha and Mary. Our lives are beautiful, too, when, with each aspect of our selves pressing and waiting with the two sisters, we roam about the Scriptures and receive the gift of the world’s “everything”—that is, the fullness of God’s created purpose, the King himself and the land he has made that is so wide, so wondrous, bestridden by Christ, and that “stretches afar” (Isa. 33:17).

The post Bring Back Beautiful Sermons appeared first on First Things.

Toward a New Humanism
October 2, 2025

The most pressing question we face today is that of the Psalmist: “What is man?” So urgent is the question of man that the question of God has re-emerged among our intellectual and cultural leaders. Ayaan Hirsi Ali, Niall Ferguson, Paul Kingsnorth, and Russell Brand have all recently professed faith. Tom Holland and Elon Musk have commented on the importance of Christianity to culture. Most surprisingly, Richard Dawkins has claimed the mantle of “cultural Christian,” though he subsequently assured the world that reports of his spiritual evolution had been greatly exaggerated.

This development is not unprecedented. In 1950, Partisan Review ran a series titled “Religion and the Intellectuals.” The authors included Hannah Arendt, W. H. Auden, I. A. Richards, John Dewey, Robert Graves, A. J. Ayer, Sidney Hook, and Paul Tillich. The editors’ introduction could describe our own moment:

One of the most significant tendencies of our time, especially in this decade, has been the new turn toward religion among intellectuals and the growing disfavor with which secular attitudes and perspectives are now regarded in not a few circles that lay claim to the leadership of culture. There is no doubt that the number of intellectuals professing religious sympathies, beliefs, or doctrines is greater now than it was ten or twenty years ago, and that this number is continually increasing or becoming more articulate. If we seek to relate our period to the recent past, the first decades of this century begin to look like decades of triumphant naturalism; and if the present tendency continues, the mid-century years may go down in history as the years of conversion and return.

That last claim now looks wide of the mark. As significant as that revival of elite sympathy for religion might then have seemed, it did not initiate a long-term change in the overall direction of the West or the cultural fortunes of Christianity.

It is too early to know whether today’s revival will prove more than a fad. But like the earlier one, it indicates something about its context. Today, as in the aftermath of World War II, what it means to be human is contested. Those who perceive this are seeking a stable foundation for an answer, and they are seeking it in religion. The turn to theological matters is one response to an anthropological problem.

It was likewise in 1950, as the world emerged from the slaughter of war, facing the realities of the Holocaust and the spread of communism. Technology, too, posed new challenges. As Sartre commented, the advent of atomic weapons placed human beings in an unprecedented situation: They had to decide to continue to exist. Today the question of what it means to be human is, if anything, more vexed. Yet the shift in the rhetoric surrounding religion offers a glimmer of cultural and political hope.

To adapt a phrase from Nietzsche, the problem in our modern world is that man is dead and we have killed him. The concept of human nature is no longer subject to any kind of consensus, with obvious and catastrophic implications for society. Man has been abolished. So what has led to this abolition? Four causes suggest themselves: Human nature has been dismantled, disenchanted, disembodied, and desecrated.

The dismantling has various causes. The Christianity that shaped western societies’ anthropology was teleological, exemplified by the thought of Thomas Aquinas and summarized in the first question-and-answer of the Westminster Shorter Catechism: “What is the chief end of man? Man’s chief end is to glorify God and enjoy him forever.” Humanity was defined by a purpose that transcended the desires of any individual. Man had ends that defined him, some natural, some supernatural. But teleology has been rare in western thinking for generations. As science restricted its consideration of causes to the efficient and the material, understandings of the significance of the world, and therefore of human nature, were transformed. The most obvious examples are theories of evolution that eschew final causality. As they have shaped the modern cultural mindset, they have dismantled the notion of human exceptionalism. When man has no God-given end, he has no stable or distinct nature. In killing God, we kill man.

The point was made by Nietzsche in his critique of Kant. One could not murder God and then expect human nature to do the late God’s work for him. If God had died, so had the notion that human beings were made in his image. Nietzsche’s program was pursued with vigor in the twentieth century by Michel Foucault, who dismantled the notion of human beings as self-constituting, rational agents. He saw them as the hapless products of networks of discursive power relations, a view that now rings out from countless university seminar rooms and underpins the rhetoric of identity politics, left and right.

The irony is that man’s very brilliance—instanced by his intellectual curiosity, analytical abilities, and technological achievements—is what enables him to assert his unexceptional status. Confusion over the question “What is a woman?” has generated headlines in recent years, but it is the result of deeper confusion over the question “What does it mean to be human?” The answer seems to be: “We don’t know whether it means anything at all. Man is a directionless clump of animated cells, drifting through time and space.”

The disenchantment of human nature has many causes and takes many forms. Georg Lukács’s concept of reification points to some of them. The industrialized society and the bureaucratized state treat people as commodities, interchangeable with one another, lacking intrinsic value as individual persons. Industrialization detached labor from community significance. But blaming industrial capitalism alone is tendentious Marxism. The ideologies of the left have also played a role. The sexual revolution, that progressive watershed, has arguably done more than anything to turn people into things. And pornography, the most consistent iteration of the logic of the revolution, makes sex into a commodity, turning the actors on the screen into objects for consumers.

Then there is the transformation of abortion from an evil into a regrettable necessity and then into a right to be celebrated. Society’s moral imagination has been shaped by the logic of the sexual revolution, in which children are deemed accidental to sex; the humanity of the child in the womb has thus been stripped of its mysterious personhood. Much the same is accomplished by reproductive technologies such as in vitro fertilization (IVF) and surrogacy. Though these phenomena witness to the good, indeed very human, desire to have children, they also propose children as things, as consumer items made to order, not begotten in mystery. Motherhood too is transformed, with egg donation and surrogacy turning women into service providers or reproductive machines.

Recent reports that the United Kingdom is on the verge of being able to manufacture sperm and eggs in the laboratory are a harbinger of what is to come. Gene editing, embryo screening, and the commercialization of fertility all tend to the disenchantment and commodification of human life. The term “designer babies” now names a plausible prospect. Human beings, once begotten through the sexual union of two persons, are set to become consumer products. Persons have become things.

The third element of our culture of dehumanization is that of disembodiment. Radical feminism since de Beauvoir has tended to treat women’s bodies and procreative functions as problems that must be solved if sexual equality is to be achieved. This has been reinforced by technologies that subvert natural bodily ends, treating them as bugs rather than features. The body is a hindrance to liberation of the self.

Disembodiment is not restricted to sexual matters. The more our interactions are mediated by technology, whether Uber apps or social media sites, the less important our bodies become. Never in human history has life required less actual, physical, interpersonal engagement. The ascendancy of chatbots, AI, and robotics will only compound this. I can order a meal, ride in a taxi, even have a romantic conversation without ever having to engage another person.

The convenience hides the cost. George Orwell once sent an angry note to a publisher, denouncing Stephen Spender for his homosexuality. Eight months later, he wrote to Spender to apologize. Spender wondered what had led to this change of heart. The answer was that in the interim, Orwell had encountered Spender in person. He explained:

Even if when I met you I had not happened to like you, I should have been bound to change my attitude, because when you meet anyone in the flesh you realise immediately that he is a human being and not a sort of caricature embodying certain ideas.

Meeting Spender in real life humanized him. He became a person, not simply an idea. We might add that it also humanized Orwell. Bodily interaction is key here: Looking into the eyes of another person involves a degree of communion; it reveals that person as a human being, such as we are ourselves. Bone of our bone and flesh of our flesh, to borrow biblical language.

Today social media have universalized disembodied social interaction and perhaps made it normative for interpersonal engagement. Disembodied interaction often reduces interlocutors to the sum of the opinions they express and thereby turns them from real persons into aggregates of ideological fragments. No wonder social media can prove to be a cesspool.

The consequences are not restricted to social media. Part of what makes surrogacy plausible is the assumption that the experience of pregnancy is of little importance to the relationship of mother and child—that the maternal bond occurs postpartum. One might object that adoption assumes the same, but the cases are not parallel. In adoption, a couple takes the place of biological parents who should be there but for some reason are not. It presents itself not as a normative model for parenting, but as compensation for a privation. Surrogacy introduces a new model of what a parent is—a model in which gestation is accidental. And it reinforces the transformation of the body into a commodity.

The transgender issue is also pertinent, given that it involves a psychologized view of identity that marginalizes the sexed nature of the body and also the belief that bodies are simply raw material. Such ideas are plausible partly because of the way in which society’s intuitions about embodiment have been shaped by technology.

And then, once again, there is pornography. I noted above its role in disenchanting human nature. It also serves to disembody it—perhaps a counter-intuitive claim, given the central role of bodies in pornography. But pornography separates sex from relationships, indeed from physical contact with another person. Consumers enjoy that quintessentially embodied form of human behavior in a manner that detaches them from any of the ordinary concomitants of sex, from personal hygiene to the effort involved in romantic relationships, not to mention marriage.

Pornography also points to the fourth element of the modern assault on human nature: Human nature has been desecrated. Sex has historically been regarded as having sacred connotations. The Torah deals with sexual matters in terms of cleanness and uncleanness. The Qur’an prescribes postcoital washings. Paul in the New Testament sees sex as a matter of great importance, such that a man’s use of a prostitute involves a fundamental disruption of his humanity and his relationship to the church. To consider sex sacred makes sense, for in creating new life, it is the act that makes humans most like God. The sexual revolution did not simply make sex into recreation; it stripped it, and therefore the human nature of which it is a central part, of its sacredness.

The concept of desecration helps to clarify the delight some people take in the dismantling, disenchanting, and disembodiment of human nature, which those categories in themselves cannot explain. To wish abortion to be “safe, legal, and rare” is to hold a disenchanted view of human nature. But to glory in it as a “reproductive right” bespeaks an exhilaration that only transgression can deliver. Current pro-abortion politics are the politics of transgression, specifically the transgression of what was once considered sacred.

The same applies to death. Cultures have typically surrounded the end of life, no less than its beginning, with sacred significance. The Torah’s approach to sex and cleanness has parallels in its regulation of the treatment of dead bodies. Even today, our laws against the abuse of corpses often use the language of desecration. And yet western societies are making great efforts to transform death from a mystery into a medical procedure—a procedure that governs not just late-stage terminal illness but old age in general, depression, indeed any condition that can be presented as burdensome to the individual, the family, or even the state.

Human nature has been dismantled, disenchanted, disembodied, and desecrated. The results are the cause of much of the moral chaos that characterizes contemporary Western societies. The Psalmist’s question “What is man?” was originally meant to express wonder at his undeserved status before God. In our mouths, it expresses our nothingness.

This brings us to the continuity between orthodox Christians and cultural Christians: a shared desire to respond to the chaos on the basis of a stable anthropology, a retrieval of what it means to be human. How can this be done? The question is difficult, because of at least two challenges, which I note here merely as matter for future discussion. First, there is the fact that, whatever its theoretical origins in nineteenth- and twentieth-century thought, as a practical matter the abolition of man has been accomplished by means of technological developments on which we all now depend. The concept of human nature has become negotiable because it seems inseparable from, and largely subject to, the technologies by which we relate to the world and to each other. Nor can we simply withdraw from this technological context. Modern-day anchorites might call us to do so, but it is worth remembering that Simeon Stylites could stand at the top of his pole only because other, lesser mortals produced and supplied the food that kept him alive. We must find ways to recover human nature that do not present an unrealistic romanticism as normative for the majority of people.

Second, there is the fact that a lack of social consensus on the existence of God, let alone on religious dogma and practice, precludes consensus on any view of human nature grounded in the divine image. This lack of consensus is a problem, since the response to the desecration of human nature must be its consecration, and consecration must occur in a religious context. Given the secularity of our contemporary context, Christians must be modest about what we can achieve.

Nonetheless, some progress can be made on the first three elements of the anthropological crisis. The Christian distinction between natural and supernatural ends is helpful here. The two cannot be absolutely separated in Christian theology, but evidence suggests that on at least some natural ends, consensus between the religious and the nonreligious can be reached. The revival of interest in religion among intellectuals, even where it is pragmatic rather than dogmatic, witnesses to a shared intuition that our cultural problems arise from anthropological confusion. That fact should encourage us. It may not amount to a return to Christian civilization, if ever there truly was such a thing. But it may mark an era in which discussion of a new humanism can be pursued by both the religious and the nonreligious.

It is no surprise to Christians that attempts to deny human nature end up either in confusion or subject to a dialectical transformation into the opposite of what was intended. Those confusions and transformations are visible to many secular thinkers, too. Therefore, pointing out the failure of secular policies to deliver on their promises is useful in building a humanist alliance and in putting anti-humanists on the defensive. Such immanent critique is a way of making space for genuine dialogue and constructive policy formulation.

Transgender ideology is a good example. At its heart lies an obvious contradiction: It authorizes disembodiment in its denial of the relevance of sexed physiology to gender identity; yet it insists on the transformation of the body, if an individual is to be authentically who he or she really is. The body is simultaneously of no importance and of overwhelming importance. Further, allowing psychological states to determine identity risks incoherence. Why cannot a man be a wolf, for example, if he is convinced that that is what he is? Yet can a human being self-consciously be a wolf, when one attribute of wolfness is unconsciousness of one’s wolfish essence?

The trans issue also exacerbates a strange contradiction within the culture of death. In at least two cases in Canada, depressed individuals who had undergone gender transition surgery were refused medically assisted death, even though the surgeries had left them in physical and mental pain. Here is the contradiction generated by progressivism’s commitment to both trans ideology and assisted suicide: To grant medically assisted death in these cases would be to acknowledge that gender transition does not always resolve gender dysphoria. It would seem that in our progressive Animal Farm, some causes of suffering are more equal than others.

The issue of biological men competing in women’s sports has gripped the public imagination, since its focus on fairness circumvents the issue that makes trans ideology plausible to so many: its foundation in psychologized selfhood and happiness. The sports issue thus offers the opportunity for highlighting the importance of embodiment. Which is more plausible—the prose of a Judith Butler, the libertarianism of the ACLU, or that picture of Riley Gaines standing on a podium beside a man posing as a woman? The case for a new humanism is there made incarnate.

The transgender issue is connected to IVF. President Trump’s actions regarding transgenderism are most welcome, but his promotion of IVF suggests that these policies are not driven by a coherent anthropology. The Trump administration is not wrestling with the broader question: What status should we grant biological limitations in an era of Promethean technology? Disappointing as the inconsistency is, it offers a chance for serious discussion of why these policies do not cohere.

The sexual revolution is also ripe for critique. Its intention was to liberate, but it has ended up turning everyone into objects. Easy access to the pill was sold as good news for women, but men have gained, too, from the promiscuity it enabled. And, despite the claims of some feminists, pornography is bad news for women, with its exploitative labor practices and transformation of the sexual expectations of its users.

Much of this has recently been pointed out by Mary Harrington and Louise Perry, writers who use secular arguments and evidence. Their work protests both the disembodiment of human nature and its disenchantment, seeing in the sexual revolution a prime example of promises betrayed and humans dehumanized. Likewise, when Jonathan Haidt warns of the effects of social media on young people, he speaks not in religious terms, but from an understanding that human nature is not infinitely pliable. There is the work of David Berlinski, an avowedly secular thinker. There is support across traditional political divides for anti-pornography initiatives. Many parents are becoming skeptical of the role of screens and smartphones in the lives of children. Combine these developments with the renewed interest among intellectuals in Christianity and its cultural influence, and the moment may have arrived for a new humanism. We need not wait for consensus on religious premises before starting these discussions. We need only point to the internal contradictions and the catastrophic consequences of our modern anti-humanist ways.

None of this is to say that a new humanism will certainly emerge in this earthly city. We may not win the day, and one who puts on his armor should not boast as one who takes it off. But there are signs that the anti-humanism of our age is overreaching by pressing the dismantling, disenchantment, and disembodiment of human nature to extremes. Many are realizing that we can fight human nature for only so long. It remains to be seen whether we will self-destruct or a new consensus on what it means to be human will shape our political discourse, our social policies, and our communities. The struggle for our cultural and political future is not best understood as a struggle between right and left, conservative and progressive, but as one between humanists and anti-humanists. And given the lateness of the day, I submit that the hour for advocating a new humanism is upon us.



This essay was delivered as the 2025 D.C. Lecture.

The post Toward a New Humanism appeared first on First Things.

The Cambrian Implosion
September 30, 2025

A historical moment ago, it was too obvious for words, but: Life is a blessing. So to regret the Cambrian Explosion that took place some 500 million years ago, when animals rapidly evolved—life endowed with eyes and speed, energized by a fillip of oxygen—would be insane. No less insane would it be to lament the emergence of the Cenozoic Era, when first primates and then humans made their way into a world that people would cultivate and sustain in forms conducive to life.

In our own time, a reversal of the blessing of life has begun: Call it the Cambrian Implosion. This implosion, foreseen with equanimity by nineteenth-century intellectual grotesques like Ernest Renan, championed by contemporary transhumanists, and variously intimated and understood by Fyodor Dostoevsky, Henri Bergson, Charles Péguy, C. S. Lewis, Martin Heidegger, and Jacques Ellul, has begun to shape the whole of our lives. The particulars and paradoxes of this moment deserve our close attention. The contrivances and apparatus of the inorganic now assert themselves against the organic in almost every dimension of life.

Of course, we too are able to affirm life, often through our capacity for ingenious technical interventions into organic nature. Technology can sometimes make the blind see; transplanted hearts and kidneys can sustain the blessing of life. In other areas, we appear to balance the claims of technology and life. Over the last twenty years, for example, the safety-enhancing technologies in our cars have improved, but the superabundance of technology in and around our cars has led to an increase in deaths from distracted driving. In this case and others like it, technology giveth and technology taketh away.

One can see a rather tortured logic making its way through decisions such as these. Francis Bacon’s declaration four centuries ago in his novelistic manifesto The New Atlantis has come into the open in our century: The exaltation of science and technology demands the “enlarging of the bounds of human empire, to the effecting of all things possible.” So we resoundingly affirm life when it is a tribute to “human empire” in organ transplants, and in making our cars safer. But when, in order to save lives, we must reduce “human empire” by limiting the use of smartphone technology and onboard screens, we struggle to do it. It is only in egregious cases such as human cloning that we are able firmly to limit technology in a way that does not use one technology against another, but allows the blessing of life to shape technology’s presence and power. It is against our century’s imperial project to accord human life authority over technology, in which we increasingly live and move and recognize our being.

Meanwhile, what is happening to life that precedes and remains independent of Bacon’s technological human empire? We know some of the answers, although they often come to us as informational disjecta membra, rather than as full and integral understanding.

Take animal nature. We rational animals increasingly live apart from the animal kingdom; nonetheless, many of us have noticed the accelerating recession of animals from our presence. There are far fewer bugs meeting their ends on our windshields every summer, and far fewer birds singing around us (in the U.S. and Canada, almost three billion fewer compared to fifty years ago). Many of us have noticed media reports of a worrisome drop in the population of local fauna, as bats no longer fly through the twilit air nor bees flit around the blossoms in our gardens. The global decline is extraordinary: Animal populations have dropped by more than 50 percent in the last fifty years.

What of human life? Even as we live in lands increasingly emptied of animals, human beings are suffering a decline, veering toward a global collapse in birthrates. Sub-replacement fertility is now the norm in the countries where the majority of the world’s population lives, and it has spread beyond Europe and East Asia to parts of North Africa and the Middle East, as well as Latin America. In some regions of East Asia and Europe, the total fertility rate hovers around or has dropped below 1.0; from Japan and Italy to Ukraine and South Korea, historically unprecedented depopulation looms over the decades and generations to come. What is happening in some lands is coming to nearly all of them: According to The Lancet, only six countries will have replacement-level fertility rates at the end of this century.

The extraordinary drop in fertility is complex. But some observers claim that the human fertility crash is a blessing. A decline in the human population may slow or reverse the decline in animal populations (though there is little sign of that yet). And it may permit adults to give the young our full attention so that they can thrive. There are compelling arguments against the latter claim, but let it stand. It thus becomes all the more striking that, by many straightforward measures, our collective well-being over the last few decades, especially among the young, has declined precipitously.

Global rates of obesity in childhood and adolescence—the most physically active time of life for most people—have quadrupled since 1990, according to the World Health Organization. That this sudden and steep decline in fitness might be remediable by giving young people a diabetes medication rather than by confronting its causes (bad food, lack of exercise, and isolated lives among them) is dubious cause for celebration.

Our inner lives, which naturally participate in embodied life and the world of “external” nature in myriad ways, have also fallen into difficulties. Adolescent and adult diagnoses of clinical depression and other mental illnesses have rapidly increased over the last several decades. Even before the pandemic did its grim work, rates of depression among American adolescents had doubled over the course of the 2010s, and rates of major depression likewise doubled. Further, from 2006 to 2016, there was a 70-percent increase in the suicide rate among white children in America aged ten to seventeen. Among African-American children of the same age, the increase was 77 percent.

Every struggle with depression is unique. But as the Oxford biologist Kathy Willis writes in her book Good Nature, regular encounters with nature improve human thinking and states of mind, in urban surroundings no less than rural ones. Yet as the young struggle to be happy, marry, and sustain deep friendships, these serious problems do not enjoy salience in respectable opinion. Rather than the preservation and fulfillment of young lives, we seek to widen access to abortion and euthanasia, including for the mentally ill.

It was thanks to the Cambrian Explosion that animals began to move more rapidly and effectively; now, in developed countries, Homo sapiens is more sedentary than ever. Britain has nourished generations of youth with stories of children adventuring through pastureland and forests, from Mother Goose to Harry Potter. But in the 2020s, the number of British children who play regularly outdoors has dropped to about one in four—a more than 50-percent decline from the late twentieth century.

As we sit under fluorescent lights and stare at screens, our musical relation to nature likewise dwindles. The disbanding of choirs (in churches and elsewhere) narrows our opportunities to create music that celebrates creation with our own voices, rather than passively receiving recorded music. Similarly, according to one survey (also in Britain), many young parents now struggle to remember the lyrics of simple, easy-to-sing songs redolent of human life in nature in order to sing them to their own children—songs such as “Row, Row, Row Your Boat” or “Mary Had a Little Lamb.”

The language of nature is replaced by the language of technology and its appurtenances: Even beginner’s dictionaries testify to the rise of the inorganic at the expense of life. To the dismay of novelists, in the 2000s and 2010s the printed Oxford Junior Dictionary removed natural words such as holly, acorn, blackberry, heather, crocus and clover, heron and otter, and replaced them with contemporary technological terms such as broadband, database, and chatroom. (Tellingly, the dictionary also removed sin and saint from the view of impressionable children.)

In our century, organic life—and with it the language and creative liveliness that perceives and articulates human beings’ affirmative relations to life—has retreated and diminished, while the presence and power of the inorganic inexorably expands.

All of this happened before the appearance of artificial intelligence. Yet this last technological implosion of the organic threatens much more, even if it never reaches its programmers’ exalted ambition of artificial general intelligence—indeed, even if it never exceeds the abilities it now has. This latest project of human empire strikes closer to our innermost life than have any of its predecessors: It strikes at our will, our capacity to discover and develop our own reasons and insights, to make our own arguments, and to create stories and works of art by human beings for human beings.

The initial calls to resist the advance of artificial intelligence are already spent. The computer scientist Geoffrey Hinton, the “Godfather of AI,” resigned from Google and warned us of extraordinary dangers for genuinely human futures. Soon afterward, more than a thousand Silicon Valley figures demanded a pause to the progress of AI research. For a moment, these extraordinary protests received dutiful attention in Western media. But the same media soon turned to what the next profitable or entertaining innovation in artificial intelligence might be—perhaps AI “friends,” including AI boyfriends and girlfriends, which are now (in a revealing contemporary turn of phrase) “a thing.”

By all accounts Sam Altman, head of OpenAI, has a decidedly awkward relation to intellectual property, and yet he receives unfathomable sums of money and deferential treatment at pompous confabs sponsored by the New York Times. Eminent opinion seems not to care deeply or to act on warnings and controversies around artificial intelligence. Global competition requires that we race forward rather than reflect, and secular global techno-capitalism too often appears to its representatives as a hideously awesome god: Evils for its sake must be piously borne.

It is true that some of the claims for the sentience of artificial intelligence made by advocates—and some opponents—are implausible. It is not likely that artificial intelligence will become conscious as we are, or that it will become entirely free of hallucinatory fancies. Even so, it is already a “good enough” imitation of our language and our rules of work and play, and for that reason it diminishes our own occasions and acts of honest thinking, reasoning, and creativity.

Our status as homo ludens—as incorrigibly playful creatures—is in retreat, and at the highest levels. DeepMind’s AlphaZero program now not only beats but dominates world-class players of chess, Go, and shogi, something that would have shocked players only a decade ago. A few years ago, its predecessor AlphaGo impelled the retirement of Lee Sedol, one of the best Go players in the world, who was demoralized by the superior technological force deployed against him.

That is only the fate of one supremely gifted player, one might say, and after all it’s “only” a game. Yet in the years before us, artificial intelligence will enter not just into play, but into justice. Many expect that AI will perform not just routine legal tasks, but generate the most ambitious legal arguments. One prominent litigator, Adam Unikowsky, argues from experience that the more advanced forms of artificial intelligence are already writing plausible Supreme Court opinions and decisions.

As a final source of resistance, we can naturally turn to those most committed to humane learning—that is, teachers at every level, including college professors. Yet a growing number of them passively or actively accept the use of artificial intelligence for various tasks by their students. Other teachers use it themselves, submitting AI-generated reports, lesson plans, and comments on papers. This turn raises the increasingly likely prospect—it has almost certainly already occurred—that teachers and professors will command artificial intelligence to grade and comment on papers that were partly or entirely generated by artificial intelligence. In this latter case, John Henry Newman’s cor ad cor loquitur is superannuated: Now bot speaks to bot.

Even when teachers resist AI, readings are cut to make them fit the habits we have acquired from our constantly expanding “interface” (ah, the lyrical grace of the technosphere) with machines. The luminous flow of Dante, Shakespeare, Eliot, and Tolstoy must make way for short excerpts, summary articles, and bullet-pointed presentations, so that the young may more easily “process” the required “information”—measures, in short, to format our organic minds with digitally induced expectations and imperatives. Practices such as these contribute to “the reverse Flynn effect,” as intelligence testing finds that young people are becoming less intelligent in the twenty-first century, an ominous reversal of a longstanding ascent.

Of course, nature absent human ministrations and ingenuity can be cruel to human beings, and one reason we have struggled against the limits of given nature is that some of them were indeed cruel. The impulse to escape nature’s deadly caprices is a legitimate dimension—one legitimate dimension—of our own nature, and many modern technologies and treatments have helped us a great deal, to say nothing of exalted explorations of the real, from quantum mechanics to theoretical mathematics. But in our moment, we must acknowledge our veering into a new age in which this single dimension of our nature increasingly asserts itself as a usurpative power over all others.

In the now evanescent modern age, the heroic efforts to eradicate smallpox and polio, to bring music and sports and reading and writing to children all over the world, rich and poor alike—those and other historic initiatives were in service of life, specifically and uniquely human life. It was held to be good that there should be more life, and that human talents and gifts should be permitted the time and attention that the harsher dimensions of nature had denied to far too many people.

This earlier modern world might occasionally have dreamt of Bacon’s empire, but it remained a human world, with human goods. Even as inveterate a modern materialist as Marx, in his reflections about the utopian fulfillment of “species-being,” speculated about forms of quotidian happiness aspirationally legible to many people in strikingly different cultures and historical periods. In those days to come, mused upon in the famous passage in The German Ideology, one’s daily round would comprise hunting and fishing, learned criticism and raising cattle.

Given that there are far fewer fauna to find for raising, hunting, fishing, or any other purpose, that we are less physically fit, and that we read a great deal less than we did just twenty years ago, these human hopes seem far less plausible in the ersatz world . . . not aborning exactly, but being fitted around us, like the bear-suit wrapped around the doomed Christian in Ari Aster’s Midsommar. We have begun to summon technology as a replacement for, even a consumer of, our own lives.

Distracted ourselves, we find it difficult to focus on the sacrifice of the well-being of the young, on our own capacity for learning and creativity, on finding and keeping happy relationships, and on caring for the nature for which we serve as stewards. Step by step, it becomes easier to relinquish ourselves, our natural possibilities and responsibilities, for a planet-spanning techno-mechanical exoskeleton of artificial action, thought, and creation. That exoskeleton offers to relieve us of the burden of being active and intelligent creatures who live reasonably and creatively in nature with other people. It offers to make us spectators in an artificial world that uses nature, including our own natures, as its resource mine—and as its trash dump.

Perhaps I spoke hastily in saying that the artificial intelligence race will not be meaningfully constrained so that humans might remain free. Last Christmas Eve, the Times reported a limit placed on the power of AI, agreed upon by China and the United States. Though the negotiation was anything but easy—it reportedly “took months”—both governments agreed that, in the words of Thomas Friedman, “no decision to fire a nuclear weapon can be made by an A.I. bot alone.” Indeed, “there always has to be a human in the loop.”

This is good to hear. But the fact that, after laborious negotiation, the bilateral regulation of artificial intelligence between the world’s two greatest technological powers appeared to issue in little but denying a new technology the authority to destroy a great proportion of life on earth is revealing, et pas dans le bon sens, as the French say. Perhaps—and it may be a proposal too tiresomely earnest for the attention of our technocratic guardians—there might also be some effort to give humans authority over technologies that will protect the prerogatives of life?

It will be difficult. To realize Bacon’s ambition and make “all things possible” is to abolish all boundaries of given form and nature, including the form and nature of human beings. For nature to be subjected completely to technical and procedural artifices, everything must be potentially changeable into something other than itself, by a sovereign will that draws energy from human passions and desired states—pride, greed, fear, idleness, pleasure—and yet ultimately tends to become unmoored from anything extrinsic to itself. The human empire must conquer nature in toto, including the varied and innermost recesses of human nature—and hence subjugate and transform all that is organically given, in ourselves and the world.

The Cambrian Implosion will probably accelerate in the decades to come, with life, especially human life and all its possibilities, in retreat. Our technocratic-managerial guardians and their sophists (now known as “Comms Teams” and “PR People”) will call this desert peace. They will praise and enrich themselves from its course, then enrich themselves again by selling select mitigations of the consequences it visits upon life. They will celebrate technological advances that treat the diseases and disabilities of human life, while maintaining silence as what is given to us as organic life declines, withers, fails to generate more life, and disappears.

It is wise and right to have confidence that there will be good paths through our century, paths that make judicious use of new technologies, including artificial intelligence. But human life, and life as such, require that technology resume its role as an assistant to human beings in the stewardship and furtherance of life, the mutual and complex relation of general nature with all of our naturally human hopes, talents, and responsibilities.

There is a better, higher standard than the one being made explicit in our time. For whatever technique, procedure, or technology discourages and diminishes organic life and the gifts and blessings of human life—even if it is powerful and profitable and convenient and affords us opportunities for ease—is not for people at all, but for incipient cyborgs.

It is past time for the Cambrian Implosion to be seen truly, with all its ugly, consumptive power over the primordial blessing of organic life. This empire should not be allowed to advance under the banner of “Progress.” Its inversive power has a truer name.

The post The Cambrian Implosion appeared first on First Things.

Zombie Bioethics https://firstthings.com/zombie-bioethics/ Mon, 29 Sep 2025 05:00:00 +0000 https://firstthings.com/?p=102986 A recent article in MIT Technology Review carries the strange ­title, “Ethically sourced ‘spare’ human bodies could revolutionize medicine.” Three Stanford biologists and ethicists argue for the use of...

The post Zombie Bioethics appeared first on First Things.

A recent article in MIT Technology Review carries the strange title, “Ethically sourced ‘spare’ human bodies could revolutionize medicine.” Three Stanford biologists and ethicists argue for the use of so-called bodyoids in science and medicine. This infelicitous term refers to hypothetical modified human bodies created from stem cells—bodies that have been genetically altered so that they lack brains, and thus, presumably, are without consciousness. The authors acknowledge that we do not yet have the technical capability to create such beings, but recent advances in stem cells, gene editing, and artificial uteruses “provide a pathway to producing living human bodies without the neural components that allow us to think, be aware, or feel pain.”

Strictly speaking, artificial uteruses are not necessary for the development of bodyoids. Such a reprogrammed embryo could theoretically be created in a lab and implanted in a woman’s uterus, as is done with IVF. But the notion that an entity regarded as subhuman should be born from a human mother seems too gruesome even for these bioethical pioneers to contemplate.

The authors admit that many will find the prospect of bodyoids disturbing, but they argue that a “potentially unlimited source” of “spare” human bodies will be immensely useful and should be pursued. We could, for example, harvest the organs of these presumably nonsentient humans and conduct experiments on them in order to test drugs and other medical interventions. The authors even suggest that it would be more ethical to do drug testing on humans who cannot feel pain, because they lack nervous systems, than on animals that can feel pain. There are other potential benefits for animal species as well, they aver, since we could use animal bodyoids to avoid causing pain and suffering in the cows and pigs we slaughter for food.

Human bodyoids are not entirely within the realm of science fiction. Scientists have recently produced “embryoids,” or “synthetic embryos,” from reprogrammed stem cells, without the use of sperm and eggs. Embryoids are living entities that seem to develop as human embryos do but that presumably lack the capacity for full human development. (We do not know for sure that they do, as they are typically destroyed after fourteen days, before the heart and brain have begun to develop.) Just as advocates for embryoids argue that their innovation allows us to avoid the ethical problems associated with embryo-destructive research, so advocates for bodyoids propose to provide us with “ethically sourced ‘spare’ human bodies.”

The Christian ethicist Oliver O’Donovan described “a position too familiar to technological society, that of having achieved something that we do not know how to describe responsibly.” In the case of bodyoids, I submit, advocates do not know how to describe them at all. One can hear them stumbling over their words and fumbling with descriptors.

Bodyoids are human bodies. Or rather, human-like bodies. But not human in any morally relevant sense—they lack brains, after all. But sufficiently human that we can harvest their organs for transplant and conduct experiments on them to see how “real” humans would respond to drugs. Indeed, they are of interest to scientists precisely because they are so, well, so very human. But not really. For the most part.

Well, then, what are human bodyoids?

Long before ethicists began to contemplate living—or at least, undead—human creatures who lack all brain function, such entities were explored in science fiction and horror films. The precise name for such a creature is zombie. The concept has roots in Haitian folklore, where the term is zonbi, referring to a person who has been brought back from the dead through magical means to serve as a mindless slave. The problem with creating zombies, our stories suggest, is that they always come back to bite us. Creating them diminishes our humanity.

Are not zombies precisely what advocates of bodyoids want to conjure into existence—a mindless slave, biologically and physiologically human in all relevant ways, that can nevertheless be experimented upon, harvested, and killed with impunity? Indeed, by our current definition of brain death, such an entity cannot actually be killed because it is already dead. In this, too, it resembles a zombie. One can easily imagine a B-movie horror feature titled Revenge of the Bodyoids.

The concept of brain death—defined as total cessation of all brain function—arguably paved the way for advocates of the creation and exploitation of bodyoids. As the authors of the article point out, “Recently we have even begun using for experiments the ‘animated cadavers’ of people who have been declared legally dead, who have lost all brain function but whose other organs continue to function with mechanical assistance.” What are we to make of the term “animated cadaver,” which seems to express a manifest contradiction?

Advocates for the brain-death criterion argue that death is the disintegration of the unified organism, and the brain is responsible for maintaining organismic unity. Liberal bioethicists also argue that without consciousness, though there may be a living human being, there is no morally or legally relevant “personhood.” But these arguments do not withstand scrutiny. The brain modulates the coordinated activity of the other organs; it does not create that coordinated activity. That is accomplished by the organic formal unity of the body as a whole—which modern science, with its reductionistic analysis of the body into component parts, fails to discern.

Although a brain-dead patient has no functional electrical activity of the brain, the patient continues, with the help of machines, to breathe and to circulate blood. The organs continue functioning and remain fresh for transplant. The body of a brain-dead person on a ventilator maintains homeostasis and coordinated unity of functions: The kidneys make urine; the liver makes bile; the immune system fights off infections; wounds heal; hair and fingernails grow; endocrine organs secrete hormones; broken bones heal and broken skin repairs; children grow proportionately as they age. Pregnant mothers can even gestate babies after brain death, sometimes for months. Consider the contradictions and manifest absurdities in this headline: “Brain-dead Virginia woman dies after giving birth.”

To all appearances, a patient in this state is not, in fact, dead. Some medical ethicists have therefore—quite sensibly—questioned the validity of “brain death” as a criterion for death. The brain-death criterion was developed by a Harvard Medical School committee in 1968 to free up ICU beds and promote organ transplantation—with death itself forming the foundation of the organ-transplant enterprise. For organ transplantation rests upon a paradox, perhaps an outright contradiction: a “dead” donor whose body, with its precious organs, is still living.

After a person is pronounced brain dead, if the family refuses transplantation or if the organs are deemed unsuitable for transplant, the following situations emerge. Once the ventilator is turned off, the patient’s heart may continue beating for several minutes, or even a few hours (especially if the patient is a newborn). Surely we would not send such a “dead” patient to the morgue, cremate her, or bury her while the heart still beats. Should we then give a drug, like potassium chloride, to stop the heart of the supposedly already dead patient? In some cases, we wait a day or two to shut off the machines of a patient who is pronounced brain dead, to allow family to travel and be at the bedside when the ventilator is discontinued and, eventually, the heart stops. Will the family witness the death of the patient, or merely the cessation of efforts to animate an already dead corpse? If the latter, why would family members want to be present for that?

Considering these oddities and absurdities, which stem from the legal fiction that brain death is the death of the person, “total brain failure” is a more accurate term than “brain death.” It indicates an irreversible coma, not a dead body. Perhaps such a person is “better off dead,” as many people assume. Certainly, it is ethically justifiable in such a situation, where meaningful recovery of human functioning is impossible, to discontinue life-extending measures such as ventilators or antibiotics. Even so, such a person is not yet dead.

Indeed, advocates of bodyoids, which would similarly lack all brain function, do not argue that a bodyoid is dead—merely that it is not human. Bodyoids are of interest precisely because they are living and human in all scientifically relevant respects. To their credit, the Stanford authors do mention the following danger: “Perhaps the deepest [ethical] issue is that bodyoids might diminish the human status of real people who lack consciousness or sentience”—such as those in a coma or babies born without a cerebral cortex (a severely disabling condition known as anencephaly).

However, the authors go on to dismiss this concern. They argue that, like bodyoids, a sufficiently detailed mannequin would look much like us; that does not make it human. But nobody is proposing scientific experiments on mannequins, and for good reason. However realistic they might appear, they are not human, and thus, unlike a bodyoid, they have no value for ­science and medicine.

A bodyoid’s value for science and medicine lies precisely in what it would be, which is not a zombie, not a dead person, not a mannequin that mimics the human form. It would be a profoundly disabled human being, designed and created to be profoundly disabled—a vulnerable human being so totally defenseless and voiceless that it could be exploited with impunity.

If this is the case, we would endorse this macabre project only if we ourselves had become, so to speak, moral zombies.


The Essential Newman https://firstthings.com/the-essential-newman/ Fri, 26 Sep 2025 05:00:00 +0000 https://firstthings.com/?p=103212 I first read St. John Henry Newman in 1985, when I was a graduate student at Yale. One of my teachers, Hans Frei, assigned Newman’s Apologia Pro Vita Sua...

The post The Essential Newman appeared first on First Things.

I first read St. John Henry Newman in 1985, when I was a graduate student at Yale. One of my teachers, Hans Frei, assigned Newman’s Apologia Pro Vita Sua in his famous introductory course on modern theology. On its face, the book would seem an odd assignment. The subject matter is Newman’s spiritual journey to Rome, which is documented in minute detail, often with reference to remote figures and obscure controversies. 

But Frei knew what he was doing. Newman’s Apologia provides the testimony of a man who refused to accept the modern reframing of faith: that the latest academic and scientific and cultural authorities ought to have authority over God’s revelation; that men ought to be free to pick and choose among dogmas; that religion is a human phenomenon serving human needs; that it is wrong to ask men to believe what they do not understand; that inherited dogmas must answer to the need for “progress.” More than any other book assigned in Frei’s course, Newman’s framed with force and clarity the challenges we faced as young theologians.

 During class discussion, Frei was at his twinkle-eyed best as he asked us what we thought of the famous “Note on Liberalism.” As Newman explains, liberalism adheres to the “anti-dogmatic principle,” which entails the subordination of the things of God to the judgments, needs, and whims of men. Frei knew that, as young students of theology, we, too, were engaged in a struggle to overcome liberalism, seeking to serve the authority of God rather than becoming spokesmen for the authority of men.

As I’ve noted, this broad and fundamental theme is not immediately evident. The subject matter of the Apologia is Newman’s struggle to remain loyal to the Church of England, a struggle he eventually lost. Moreover, in the Apologia, his most detailed reasons for crossing the Tiber are almost always negative, turning on many painful realizations that he was deceiving himself in his arguments for Anglicanism. Although I would eventually follow the same path, I don’t believe that the Apologia’s importance is restricted to struggling Anglicans. Newman’s most important lessons are for all modern men, believers and unbelievers alike.

It’s difficult to characterize Newman’s lasting insights. For the most part, Newman did not write about normal theological topics such as Christology or the doctrine of the Trinity. In almost every instance, Newman’s publications were occasional, which is to say, he wrote about the particular problems and difficulties of his time. For example, The Idea of a University is a polemic against the utilitarian mentality that was gaining power in the nineteenth century and has become entirely dominant in the twenty-first. Newman’s treatise on the development of doctrine answers questions raised by modern historical study, which documents changes so dramatic as to suggest discontinuity. His sermons on faith and reason and the Grammar of Assent address the bipolar disorder of the modern intellect, which swings back and forth between dogmatic rationalism and despairing skepticism. Nevertheless, I’ll take a stab at “the essential Newman.”

His central insight concerns the divine gift of concrete and substantive things, natural and supernatural. Their “weight” gives them staying power in the face of human pride and willfulness. And, furthermore, the “weight” of things, their plenitude of reality, casts a spell over us. For to be human is to seek to dwell in the light of what is real rather than prowling about in the shadowlands of ideas, theories, and syllogisms—“paper logic,” as Newman puts it.

Newman is rightly held up as a trustworthy guide to the difficult question of the relation of faith and reason. He understood the weakness of argument as compared to the power of things that have weight. “Deductions have no power of persuasion,” he writes in a famous passage:

The heart is commonly reached, not through reason, but through the imagination, by means of direct impressions, by the testimony of facts and events, by history, by description. Persons influence us, voices melt us, looks subdue us, deeds inflame us. Many a man will live and die upon a dogma: no man will be a martyr for a conclusion. A conclusion is but an opinion; it is not a thing which is. . . . No one, I say, will die for his own calculations; he dies for realities.

One of the powerful vacuums of our time is relativism, the notion that everything we believe arises from within—or, if one has taken a class in social theory, the doctrine that what we believe is “socially constructed.” We cannot argue our way out of this dead end. Only the force of reality can deliver us.

In a similar way, in matters of faith, Newman asserts what can be called the dogmatic principle. Faith is not a sentiment or speculative enterprise. It believes something specific, something taught and commanded. To this Newman adds a second, ecclesiastical principle: There exists an institution that does not simply convey these teachings and commandments but embodies and enacts them, in short, “a visible Church, with sacraments and rites which are channels of invisible grace.” The Church is primordial, a given fact, a “concrete representative of things invisible.”

At the end of his Apologia, Newman meditates on the Catholic Church’s claim to teach without error. (It is this material that might rightly take a leading place in a specifically Catholic apologetics.) In the face of modern man’s confidence that he possesses, for the first time in history, the tools with which to winnow and improve, sift and update, this claim marks the height of religious intransigence, which is why Catholicism then (and still) strikes many as dangerously medieval and authoritarian. Newman celebrates the “prerogative of infallibility.” It is to be cherished for “smiting hard and throwing back the immense energy of the aggressive, capricious, untrustworthy intellect.” The Church’s power to teach with authority, her daunting weightiness, rescues reason from its “suicidal excesses.”

Modernity has encouraged us to build an empire of reason. The upshot is now plain to see. Our society, imbued by liberalism and its anti-dogmatic animus, is characterized today by soulless technocratic management of utilities and a dispirited skepticism about transcendent truth. As a Doctor of the Church, Newman prescribes for our postmodern illness a strong dose of “weightiness.” If reason is to flourish, it must feel the force of divine instruction. “It thrives and is joyous, with tough elastic strength, under the terrible blows of the divinely-fashioned weapon, and it is never so much itself as when it has lately been overthrown.” What holds for reason also obtains for true freedom—and, indeed, for the entire project of sustaining a culture that is congenial to human flourishing.


Image by Jacques Bihin, licensed via Creative Commons. Image cropped.


Contained Controversy https://firstthings.com/contained-controversy/ Thu, 25 Sep 2025 05:00:00 +0000 https://firstthings.com/?p=103358 To describe the shifting relationship between religion and art, we must use the broad brush. ­Judaism and Islam are ­aniconic (image-less) religions. Christianity, an iconic religion, is the odd one out...

The post Contained Controversy appeared first on First Things.

The Last Supper:
Art, Faith, Sex, and Controversy in the 1980s

by paul elie
farrar, straus and giroux, 496 pages, $33

To describe the shifting relationship between religion and art, we must use the broad brush. Judaism and Islam are aniconic (image-less) religions. Christianity, an iconic religion, is the odd one out. In the Torah, the Second Commandment prohibits “graven images,” and not just of people. Portraying anything that lives “in the heavens above . . . in the earth beneath . . . or in the water under the earth” is not kosher. Muhammad omitted instructions for artists from his otherwise comprehensive code for living, but the ninth-century hadiths of Sahih al-Bukhari say on his behalf that “image-makers” will be punished on the day of resurrection for claiming to “breathe a soul” into their creations.

Christianity absorbed the pagan veneration of human images. Christians have occasionally taken the Second Commandment at its word and smashed their images—notably the Byzantine Christians of the eighth century, whose pious vandalism gives us the word iconoclasm, and European Protestants in the Reformation, whose destruction anticipated the bare interiors of modernist design. The last major episode of Christian iconoclasm I can recall was in 1966. Frenzied hordes of young Americans, most of them Southern Protestants, burned albums by the Beatles, a group of baptized Anglo-Irish Catholics.


Everyone’s a critic. The disposition to perceive harmonious arrangement and expression is innate. So is the mimetic response, the urge to create coherent, communicative objects. These impulses keep returning, from the Golden Calf to the golden mean. The aniconic religions handle them by accommodating public demand while affirming the priority of words over images and religion over art. There are lions and vegetables in the mosaics of Hellenistic-era synagogues in Israel, full-length Byzantine-style portraits on the walls of the synagogue at Dura Europos in Syria, and a mountain of portraits of Ottoman emperors and Persian wine parties, but the pictures, like the spicier imagery in the Song of Songs, are framed within the four cubits of religion. The same goes for photographs of the Lubavitcher rebbe Menachem Mendel Schneerson and Ayatollah Khomeini.

Christianity is different. Painting, sculpture, and theater became fully Christian art forms. Christianity also differed from Judaism and Islam in its concept of the earthly saeculum. As the arts began to work themselves loose from Christian supervision in early modern Europe, they became oriented toward secular patrons and emerging markets. These arrangements allowed the producers first to cultivate a secular audience, and then to declare themselves independent of any authority. A new social creature appeared. “The artist—as he comes to be called—ceases to be the craftsman or the performer, dependent upon the approval of the audience,” Lionel Trilling wrote in Sincerity and Authenticity. “His reference is to himself only, or to some transcendent power which—or who—has decreed his enterprise and alone is worthy to judge it.”

John Ruskin traced this figure’s inception to the era of Raphael and the celebrity artists of the later Italian Renaissance. Trilling dated his emergence to the Enlightenment, though this probably had less to do with philosophy than with market forces and copyright laws. Either way, the decoupling of religion and art decoupled the signifier from its signified, without always reuniting them in the social world. The watershed artwork might be Giorgione’s Tempesta (1506–08). It looks great, but no one knows what it’s about. Perhaps Giorgione didn’t know.

This uncertainty would not surprise Paul Elie. The Last Supper describes itself as an attempt to examine “crypto-religious” art in the 1980s. The art is mostly paintings, songs, and movies, though novels and poetry appear here and there. The focus is on New York City, which was the capital of the art and publishing worlds, a key stage for popular music and gay life, and hence a battleground in the culture war between gay artists and the Catholic hierarchy in the decade of AIDS. The task of the rearguard action fell to John Cardinal O’Connor.

Elie’s “crypto-religious” artists are mostly of Catholic extraction. Most of the works that interest him carry Catholic connotations, autobiographical and theological. Some are explicit—for instance, Andy Warhol’s “Last Supper” series (1984–86), in which Warhol reprinted Leonardo da Vinci’s painting; Martin Scorsese’s 1988 adaptation of Nikos Kazantzakis’s 1955 novel The Last Temptation of Christ; or the MTV-powered rise of Madonna to what the language of publicity calls “iconic status.” Others, such as Robert Mapplethorpe’s gay S&M photos, were by Catholics with a grudge.

An icon is a symbol standing for the real thing, intangible though it may be. The semiotic distance is narrow in a church—and narrowest in Eastern churches, where icons are venerated as possessing in a mysterious way the sacred realities that they represent. But the semiotic distance is widened by the marketplace. Many of Elie’s artists and works descend, like the imagery in a Madonna video, from organized religion, but their adult personalities and works are products of disorganized religion, the subjectivity that Bruce Springsteen, with typical articulacy, called “that searchin’ thing” that’s “like, religiously based, in a funny kind of way. Not like orthodox religion, but it’s about basic things, you know?”

We do. Madonna released her debut single, “Everybody,” in 1982, exactly a century after Nietzsche detected the death of God. The long contraction of institutional religious authority opened the semiotic field for the spiritual countermoves of the “Me Decade” of the 1970s: evangelical revival and post-Christian searching, therapeutic idealism and cultish disorder. In the 1980s, the semiotics of icon-making scrambled into a heap of gleaming images. We got “searchin’ things” such as the band U2, another act whose endurance and popularity suggest civilizational senility.

U2’s guitars and drums echo through vast halls of reverb and delay. The lyrics are studiously vague passion plays, delivered with implausible seriousness. Like the theatricals of Madonna, who, Elie claims, “made the sports arenas of Europe her pulpit,” U2’s music makes the audience “a giant congregation, drawing together the energy of revival, rock show, football match, and political rally.”

None of these acts was “crypto-religious.” Their religiosity was explicit. Elie takes his hermeneutic from the Polish-born poet Czeslaw Miłosz. “I have always been crypto-religious and in a conflict with the political aspects of Polish Catholicism,” Miłosz wrote to the Trappist monk Thomas Merton in 1951. In Miłosz’s poem “1945,” the artist is the “descendant of ardent prayers, of gilded sculptures and miracles.” He stands “alone with my Jesus Mary” against the “illusions” of modernity and its “flat, unredeemed earth.”

Crypto-religious artists of the 1980s, Elie writes, had to “create new patterns of reverence.” The “golden hue” of Andres Serrano’s “Piss Christ” (1987) is suffused with “a touch of the apocalypse,” and the tilt of the crucifix suggests “windswept, earthshaking drama.” This legacy of “Serrano’s Catholic experience” in the Brooklyn churches of his boyhood was received ecstatically in downtown Manhattan at a time when the scene was “ravaged by a disease transmitted through bodily fluids” and the city “riven by religious strife over the body and its claims.”

Serrano was embraced by the critical crowd and praised as a provocateur. So was Robert Mapplethorpe. They did not stand alone, as Miłosz had in the interregnum between fascism and communism. Nor did Warhol, Madonna, and U2. The big names of the eighties were illusionists of modernity, flatteners of transcendence, bringing heaven down to earth and selling it wholesale. They dealt in multiples: prints, photographs, vinyl, CDs, radio mixes and remixes. They capitalized on a business model whose rules of outrage were clarified in the 1960s. That business attained peak commercial coherence in the years that Elie studies: between the advent of MTV in 1981 and the breakout of the commercial internet in the mid-1990s.

There was nothing “crypto” in the rejection of Church authority by Catholic artists such as Serrano or Sinéad O’Connor, or in the Catholic motifs in the films of Martin Scorsese. It doesn’t get more literal than squirting vials of urine onto a crucifix or making a heretical movie about the life of Jesus—or than the sex-and-religion psychodrama of the promiscuously Protestant Prince, whose music, Elie writes, “is a reminder of how much like music religion is.”

Not much cryptography is required to decode Catholics of the “searchin’ school” either. It was in the 1980s that Bruce Springsteen succumbed to the last temptation of celebrity and expanded his inter-song banter from its 1970s mode (mumbling in the manner of De Niro’s Johnny Boy character in Mean Streets) into extended secular sermonizin’ about the kind of banal and basic things that U2 was also shopping around the stadia. Even Bob Dylan, the most cryptic lyricist of the age, returned to literalism for the first time since 1964 on the three albums that followed his conversion to evangelical Christianity in late 1978 or early 1979.

The literalism continued when Dylan switched back and made the first overtly Jewish identification of his career. The artwork for 1984’s Infidels showed Dylan wearing kippah, tallit, and tefillin at the Western Wall for his son Jesse’s bar mitzvah. The first single, “Neighborhood Bully,” was a defense of the State of Israel, and as straightforward as an allegory can be. Elie believes that its lyrics “faulted the foreign-policy aims of Israel and the Christian right.” That is literally nonsense.

Elie is more interested in biography and culture-war iconography than in the quality of the artworks or whether the artist really is autonomous in Trilling’s sense. Not since Wagner packed them in at Bayreuth had sex and death been conjoined so explicitly. So why was little art of enduring merit produced in the 1980s? None of Elie’s downtown favorites amount to much in the critical rearview. Warhol had not produced anything of interest in decades. Wim Wenders was boring. Keith Haring was untalented. Sinéad O’Connor and Patti Smith were limited singers and second-rank songwriters whose biggest hits were covers. The William Burroughs chorus acclaimed Jim Carroll, but he came and went without producing anything of note. Robert Mapplethorpe was Tom of Finland for snobs. And Tunnel of Love was rubbish.

In America, Miłosz detected “an indifference to basic values,” especially on campus, and the “subversion of the ethic of the working class, which was God, my country, my family.” The causes, Miłosz said, could be traced back “infinitely,” but the effects reflected a “very powerful transformation as far as religious imagination is concerned.” Civic piety was declining because the human image was being stripped of its “religious dimension.” The growing “difficulty of translating religion into tangible images” was also a transformation within religion and part of a fundamental social shift.

In the 1980s, you could buy albums on cassette, vinyl, or CD. It was a decade in which the social codes overlapped, too. Elie calls it the end of one era and the start of another. American society was becoming less homogeneous and more secular, but it was still possible to draw crowds by staging an old-style Kulturkampf and provoking John Cardinal O’Connor to complain about blasphemy. Disordered religion was established as the new church of the liberal upper-middle class, but the conservative lower-middle class turned evangelical and Reaganite. That spiritual reordering of old antagonisms announced a new era of intensifying hostilities—our era.

The outgoing age, Elie argues, was the world of Andres Serrano’s childhood, traditional and communal. The incoming age was “postsecular”: atomized, identitarian, and therefore increasingly religious. The characteristics are recognizable, though it is worth adding that we might now be in a period when the searchin’ and findin’ is leading to reaffiliatin’ among the young. And we should remember that, as Elie argues, the outgoing secular age was itself susceptible to religious voices, whether cryptically in the redemptive passion of “progressive social revolutions” or explicitly in the political rhetoric of Abraham Joshua Heschel, Martin Luther King Jr., and the Berrigan brothers.

The publication of Salman Rushdie’s The Satanic Verses in 1988 was the last significant literary event in America, and not because it was a great book. Elie argues that “the novel’s central challenge was a crypto-religious one,” but there was nothing “crypto” or even cryptic there. Rushdie was a secular Third Worldist, so he never made a Miłosz-style argument for rescuing a hypothetical spirit of Islam from Islamic institutions. In life and fiction, Rushdie made clear and calculated arguments for art’s rights over religion. It soon became clear that he had miscalculated.

Blasphemy was back. Riots, murders, and death threats ensued. Islam’s aniconic tradition imposed itself by force. A new age of literalism began. The liberals split. Most of the older writers defended Rushdie’s right to free expression on artistic grounds. Norman Mailer called the fatwa the “largest hit contract in history.” The lion and the lamb shared the spotlight at a pro-Rushdie PEN rally when Leon Wieseltier (Jewish, pro-Israel) joined with Edward Said (Christian, not big on Israel, but also not, as Elie thinks, an adviser to the “Palestine Liberation Authority”) to support Rushdie, an Indian-born Briton who was what some would now call “post-Muslim,” with words that—encouraging as Rushdie may have found them while he became the underground man of the age—meant nothing.

Others sided with the complainants. Roald Dahl called Rushdie a “dangerous opportunist.” A similar prickliness about presumptuous immigrants rocking the boat and the literary hierarchy can be detected in John le Carré’s response: “There is no law in life or nature that says great religions may be insulted with impunity.” Jimmy Carter, then in the prime of his fatuous ignorance, accused Rushdie of “defaming” the Qur’an and “vilifying” Muhammad. John Cardinal O’Connor deplored threats of terrorism but called Rushdie’s novel “insulting and insensitive.” Elie, for whom O’Connor is the villain of the piece, calls this “the higher ignorance.”

The battle lines of the culture war shifted into their current alignment, with America now one front in a global conflict. When Elie argues that “Islam in American public life is still typed as peripheral and cryptic—as gay life was a decade earlier,” you get the impression that he is trying to limit the controversy over what can and can’t be said to the local and manageable. It isn’t. It wasn’t in the decade of Live Aid, when the pop stars of America sang “We Are the World” in a humanitarian foretaste of 1990s triumphalism. Lou Reed and Leonard Cohen both saw this at the time, but Elie skates past them.

Reed appears in the crowd scenes of The Last Supper, including on the crowded stages of the tedious caravan of post–Live Aid charity concerts. Elie never considers him in any detail, though Reed’s 1989 album New York touched on most of Elie’s motifs: Warhol world, AIDS, gay life, the unraveling of American society (“Last Great American Whale”), the withdrawal of religion from public life (“Busload of Faith”), and even the scandal of Pope John Paul II’s receiving the newly elected Austrian president Kurt Waldheim, who really had been cryptic about his wartime service in the Wehrmacht and his role in deporting Greek Jews to Auschwitz (“Good Evening, Mr. Waldheim”).

Meanwhile, Leonard Cohen’s “The Future” (1992), a real crypto masterpiece in the Miłosz sense, is barely mentioned. The band plays a 1970s groove reminiscent of J. J. Cale. It is oddly jaunty in the way of an empty fairground ride, but the musicians sound exhausted. The cultural revolution of the 1960s is played out. The future is chaos. The “breaking of the ancient Western code” will produce a new kind of aniconic blindness, in morals and semiotic meanings. “I have seen the future, baby,” Cohen sings, “and it’s murder”:

Things are going to slide in all directions
Won’t be nothing you can measure anymore
The blizzard of the world has crossed the threshold
It’s overturned the order of the soul

“Repent, repent,” the gospel chorus chirps.

“When they said repent, I wondered what they meant,” Cohen replies. He might have had Mapplethorpe in mind when he wrote lines such as “And now the wheels of Heaven stop, you feel the devil’s riding crop,” and “Give me crack and anal sex.” Cohen certainly saw what the 1980s really were: the last, lazy decade of old certainties, of the Berlin Wall and “Stalin and St. Paul,” of a box office thriving on a theater of rebellion in which all the pistols fired blanks. This nostalgia for contained controversy shapes Paul Elie’s reconstruction. It is not wholly true to its time, and it is false to ours.


Image by Nan Palmero, licensed via Creative Commons. Image cropped.

The post Contained Controversy appeared first on First Things.

]]>
Politics for Losers https://firstthings.com/politics-for-losers/ Wed, 24 Sep 2025 13:00:00 +0000 https://firstthings.com/?p=103330


]]>
Why Christians Should Be Leftists
by phil christman
eerdmans, 229 pages, $23.99

In a 2002 essay, Christopher Caldwell—perhaps the premier conservative journalist-intellectual writing today—paid a memorable compliment to the Marxist theorist Marshall Berman. Berman’s “primer” on The Communist Manifesto, Caldwell remarked, “gives the first convincing account I’ve ever read of how an early reading of Marx could (and perhaps should) inflect a man’s thinking on everything for the rest of his days. Since no description can do it justice, let me just say I’m glad I didn’t read Berman’s essay at fifteen. They might have had me.”

This last sentiment came to mind as I read Phil Christman’s Why Christians Should Be Leftists. The book finds me approaching midlife, already a fusty academic and reflexive conservative, but once upon a time I was a precocious Christian teenager reading Bonhoeffer and Hauerwas, listening to Rage Against the Machine, and proclaiming to friends and family the madness of the Iraq War. Christman’s unique cocktail of palpable passion, self-deprecating piety, witty erudition, and unironic love for Jesus would have proved irresistible. They would have had me for sure.

Christman teaches writing at the University of Michigan and is a prolific essayist, both for outlets you’d expect (Jacobin, The Baffler, The Nation) and those you might not (The Bulwark, the University Bookman, the New Atlantis). Becca Rothfeld, reviewing one of his books for the Times Literary Supplement three years ago, dubbed him “one of the best essayists in America.” She’s right.


Why Christians Should Be Leftists is a personal tract—a “testimony,” in Christman’s words—explaining how an awkward, working-class kid from a fundamentalist home in the Midwest became a socialist without losing his faith. Or better to say: It narrates why his Godward faith led him leftward. Like every testimony, the book is an argument wrapped in an altar call. Christman lays out the reasons why you, too, for the sake of the gospel, should join him on the left.

According to Christman, it’s an opportune (not to say providential) moment for such a witness. The last decade has unmoored longtime party memberships and political identities, in both directions. The book is an exercise in outreach: to disaffected conservatives and Never Trumpers, to ex-vangelicals and mainline liberals, to wine moms and the newly woke on race, class, and gender. It says to them: You’re right to smell a rat. Its name is capitalism. Jesus shows us a better way.

Christman sets himself three tasks. First, he has to convince “normie” readers that politics is neither a bad word nor best kept separate from faith. As he puts it, “‘politics’ is just ‘morality as practiced by more than one person’”—in other words, “the search for the best way or ways to live together.” This search is unavoidable, and when we manage to do it well, “to reason and argue together, without sophistry, secrecy, or force,” the upshot—however imperfect—is a common life with an outside chance of being better than it would have been otherwise.

If Christman’s Christian leftism sounds deflationary, that’s because it is. He’s neither utopian nor revolutionary. But he does think the lives of the poor can be improved. And he wants Christian readers to understand that, given the structural forces that maintain the status quo, politics is the primary mechanism of that improvement.

Christman’s second task is to persuade left-curious readers that capitalism as such is a problem, that classical liberalism is partly true but insufficient on its own, and therefore that the Democratic Party doesn’t go far enough. Though he wouldn’t mind seeing former Republicans voting Democratic, his real aim is to swell the ranks of the party’s left flank. The kind of policy wins that Christman wants will come about only if Democrats feel electoral gravity pulling them leftward. Granted, in actual American elections leftists should almost always vote Democratic, but because “even the Democrats are always, in practice, in some amount of collusion with capitalism,” neither leftists nor Christians should align with them except with “a certain amount of critical distance, distrust, and irony.”

The case Christman makes against capitalism is partly empirical, partly moral. The empirical argument is simple: Capitalism immiserates at the structural level; alternative structures, whether ameliorative or substitutionary, could relieve the plight of millions; and the primary force preserving the status quo is the wealthy few who stand to lose their riches in a new social arrangement. Christman defines capitalism as “the right to property run amuck.” More broadly, it is

the social system in which the means of production—the stuff that makes all our stuff, which includes land, equipment, and also intellectual property, such as patents and the like, and also the stock that gives you a controlling interest in these things—is allowed to belong to individual people, who have a legal right to pass it on to their children, sell it to other individual people, or whatever else they might take a mind to do with it. That’s it. That’s all I mean. It’s not something Christians have to make our peace with because of our fallen nature, any more than feudalism was, or rule by gangsters, or the culling of the left-handed. We can and should severely check it and restrain it, or get rid of it entirely.

To get rid of it “would mean either heavy taxes on, or common ownership of, the sort of property that produces wealth”—that is, “the stuff that gives you a lot of de facto power over other people.” It wouldn’t mean, for Christman at least, the abolition of private property per se, or the imposition of de-growth austerity (“a sort of global Cuba”), or the dissolution of institutions into hippie communes, or purging the world of innovation and nice things. You can keep your toothbrush in paradise. Electric rails and neighborhood swimming pools await as well.

The empirical argument may be old hat for some of Christman’s comrades, but it isn’t the heart of his case. The chief inspiration for his brief against capitalism is instead the Sermon on the Mount. In five concise chapters, Christman both stabilizes the theological foundations of his argument and clarifies his own politics. For Christman, the gospel cannot serve as a mere adjunct to ideology—a spiritual handmaiden to a material politics that always already knows what’s best. More than a partner, the gospel leads the way and tweaks the ideology as a matter of course.

In this section, Christman tackles his third task: unfolding the logic of Jesus’s teaching. He offers compact theological sketches of created reality, human nature, human labor, human community, and the scope and ethics of love for neighbor. Christman’s writing is at its most beautiful and affecting here, as when he describes our efforts to know the truth as one great “organic mass of error and insight,” followed by this sentence: “But happily, if Christianity is true, then Jesus, when he wants to, directs the formation of this mass, and keeps it from becoming too damnably big of a muddle.” Even a Calvinist would have trouble improving on this plain-spoken description of providence.

For Christman, the Beatitudes encapsulate and commend the key to God’s Kingdom, what you might call a metaphysics of solidarity. “We live in a moral universe,” he writes, and what he means is that Jesus isn’t the crazy one—we are. By nature, human beings are not “selfish survival machines” or utility maximizers. We are creatures of the God who is love, made in his image yet ruined by sin and folly. Christman doesn’t rely on natural law, but like a good Thomist he subjects civil law to judgment by divine law. In this case, the law of God is the teaching of Christ in the Gospels—the law of the land, so to speak, in the coming new creation. True, human depravity cannot realize this law in the here and now; our fallen politics can only approximate it. This admission must not, however, be used to foreclose approximation altogether. Seek ye first the Kingdom of God, and his righteousness; and all these things shall be added unto you.

I am tempted to call Christman’s vision left-integralism. That wouldn’t be entirely fair, but it gives you a sense of how unafraid he is to infuse earthly politics with the politics of Jesus. It also underscores how philosophically unsatisfying he finds modern metaphysics and its attendant anthropologies. In the end, he argues, economics without ethics is an illusion. We are always already legislating morality. We might as well hash out the morality out loud, together, instead of outsourcing it to bureaucrats in Washington or consultants on Wall Street.

Jesus’s economy of grace names its enemies even as it loves them. At the systemic level, its principal opponent is an ontology of winners and losers, according to which each group is destined to receive exactly what it merits. This, Christman believes, is the general run of things in a fallen world—hierarchy, domination, oppression, plunder—but capitalism gives it a particularly insidious twist, rationalizing the results as the just deserts of a benign god: the invisible hand of the free market. As Job heard from his friends, you get what you deserve.

Such a view rests on “the idea that misfortune is personally discrediting, ugly, embarrassing, and rightly to be shunned,” and it is therefore “a direct contradiction of everything Christ taught by word, and even more by deed.” Finding this “fundamental worldview” the one constant in our current president’s beliefs, Christman concludes: “Trump hates losers. In the incarnation, God broke metaphysics in order to become one.”

It is this simultaneous universality and solidarity—call it Christ’s moral catholicity—that defines the Kingdom, for Christman, and underwrites his brand of leftist politics. The parable of the vineyard upends every expectation we might have had about God’s justice. Wages in exchange for labor become a gift in return for nothing at all. The wages of sin are death, but the gift of God is eternal life. And if we are meant to be Christlike, which is another word for Godlike, and if with the good Samaritan we are “to regard anybody, everybody as our neighbor,” then there are no limits on either whom we are to love or the depths of our giving.

In a word, we cannot serve both God and Mammon. And if we want our society to honor the former, we must cast down the latter from its throne. In Christman’s hands, therefore, leftism means the rejection of idols in political economy. As St. Paul said, greed is idolatry. Those who say that greed is good merely unveil the object of their worship. We bribe the gods of the market in return for security, status, fame, pleasure. But like every idol, these gods cannot make good on their promises in the long term. Thus, to give our money away—whether by handing cash to a man on the street or through massive government redistribution—becomes an act of spiritual warfare. As Jacques Ellul once remarked, giving “desacralizes” money and thereby profanes it. It is that rare action that will never fail to achieve its goal, which is to infuriate the devil.

Late in the book Christman writes that, if readers have agreed with him on certain basic claims—that we live in a moral universe, that kings are not good, that neighborliness is universal and work is part of our common calling by God—and “admitted their economic implications,” then they are “already somewhere on the broad political left.” I’m not so sure.

One way of putting the problem is to ask whether one can be a socially conservative social democrat. Christman hates the police, supports open borders and gay marriage, and is ambivalent on abortion (“I can’t really take seriously the idea that it’s murder, or that it’s the state’s business, when a woman gets rid of a new blastula, or even when she aborts at ten weeks”). What about a Christian who accepts much or most of Christman’s economics but stands on the opposite side of these issues? Is he a conservative socialist, a socialist conservative, neither, or both? The previous three popes might fit this description, but at most one of them could be yoked to the left. Yet it was not Francis but Benedict XVI who wrote that, “[i]n many respects, democratic socialism was and is close to Catholic social doctrine and has in any case made a remarkable contribution to the formation of a social consciousness.”

Christopher Caldwell, in the essay I quoted earlier, offered a “working definition” of right and left: “Rightism is the belief that liberté, leftism that égalité, forms the basis for fraternité most in harmony with human nature. To the extent that the left believes in human nature.” Christman devotes his book to equality, understood as the leveling of economic fortunes, which in turn presumes and generates fraternity, understood as solidarity (since he does believe in human nature). Yet Christman says next to nothing about liberty besides pouring scorn on the unfettered market. This is a mistake, for at least two reasons.

On one hand, it leaves intact the pretense that there are no tradeoffs between freedom and equality. At a theoretical level, Christman knows full well that there are substantive questions to be raised about which freedoms are inalienable, irrespective of whether their loss would mean greater economic justice. The obvious example is private property, including the presumptive right to do with it lawfully as one pleases. Another is family life. The government needs very good reasons to insert itself into a household’s affairs, regardless of the projected impact on the overall equity of society. To put it baldly, mothers and fathers are allowed to be subpar parents.

But even more than this, Christman knows that, at a practical level, Americans of all stripes take their civil liberties so much for granted that very few would part with them voluntarily—or even at the point of a gun. Leaving these instincts and tradeoffs unaddressed is a missed opportunity.

Though it would be simplistic to say that individual freedom and state power are merely inversely related, leftist economics necessarily entails the enlargement of the latter at the expense of the former. That is, a transfer from one to the other. Any honest socialist will admit this. Nor will anyone except the most extreme libertarian deny that some balance, some set of tradeoffs between the two, is unavoidable if society is to function. It is regrettable, then, that Christman fails to take seriously believers’ honest concerns about a state sufficiently powerful and legally entitled to plan a socialist economy. It is not only the billionaires who fear such a thing.

Speaking of billionaires, a recurring feature of Christman’s account is his proclivity for identifying villains. He’s often very funny when doing so, as when he imagines Henry Kissinger “finally received by God” following eons of penitence, but the habit consistently undercuts his argument. Too often he blames individual capitalists, when he should be talking about supra-personal structures. He writes, for instance, that “whether there will be a job worth having in walking distance of your neighborhood,” or “whether your child is going to school on top of a toxic dump,” or which “two options we’ll pick from when we choose the next president,” is all “more or less decided by the people who hold all [the] capital.”

At other times, he talks about macro-manipulation when a democratic accounting of ordinary voters’ fears, frustrations, and preferences would shed more light. For example, Christman sees the neoliberal turn in the 1970s and 1980s as effectively a conspiracy theory from above, as though stagflation, riots, political violence, urban disarray, and skyrocketing divorce rates had nothing to do with it. To recognize these factors is not to suggest that the free market can solve them; it is to avoid robbing voters of agency and reasons of their own.

Has Christman never chanced upon bona fide libertarians in the wild? I have. They’ve done their homework. And their honest conclusion is that, on average, government regulation of the market does more harm than good. They may be wrong, but they’re not stupid, and nobody’s manipulating them.

Fortunately, Christman’s temptation to villainization is mitigated by his own lacerating self-criticism (the man believes in original sin!), and by his refusal of left-Schmittian—in plain terms, Marxist—sorting of revolutionary friends and bourgeois enemies. Even capitalists can be saved; after all, as Jesus said about the rich entering the kingdom, with God all things are possible. This commitment to charity fails Christman, however, when he turns to immigration. For two or three pages the book becomes an online screed about racist nativists, and Christman cannot muster the imagination to consider why anyone of goodwill, much less a Christian, would think borders and citizenship meaningful, worthy, or nonfictional political notions. Another missed opportunity.

The final oversight is conservatism itself. Reading Why Christians Should Be Leftists, one gains no earthly idea why any serious Christian would be anything else. There is indeed a romance to Marxism, but so is there a romance to conservatism. Consider Roger Scruton: “politics on the left is politics with a goal,” whereas conservatism “is a politics of custom, compromise, and settled indecision. For the conservative, political association should be seen in the same way as friendship: it has no overriding purpose, but changes from day to day, in accordance with the unforeseeable logic of a conversation.” In short, conservatism “means the maintenance of the social ecology.”

That’s the positive pitch. The negative is that (again in Scruton’s words) “it is not an accident that the triumph of leftist ways has led so often to totalitarian government.” Christman doesn’t countenance Maoist cosplay—those irritable online gestures feigning revolutionary violence—but more than a few secular socialists admit that the ends justify the means. The otherwise laudable essayist George Scialabba once cast his lot “with Lenin, Trotsky, Koltzov, and Cockburn that truth is whatever serves the revolution.” Yes, he admits, they were wrong in practice, but they “were right in principle: the liberation of humanity is certainly worth lying and murdering for, if these can be shown (though I doubt they can) to be the best way of achieving it.” For every convert to leftism thrilled by such language there are a hundred lifetime defections from it, and not without reason.

Perhaps Christman wants left activism to be leavened by believers who see God’s image in the face of the oppressor. The danger is that Christians will end up working and marching alongside people who see otherwise and act accordingly.

“In current Western political discourse,” wrote the late Robert Jenson in these pages in 2014, “the most preposterous warranting narratives are immune to all refutation, and their devotees cling to them with the certainty of unacknowledged desperation. Two examples: the economic narrative of the Republican base and the bioethical narrative, which is the Democratic equivalent.” This is a memorable testament to the topsy-turvy nature of Christian politics. Believers can be found everywhere on the spectrum, and ever was it thus.

Phil Christman makes a forceful and eloquent case that Americans who bear the name of Christ need to reconsider the economy. It is undeniable that the gospel stands in judgment over our treatment of the poor, the vulnerable, the weak, the outcast—over our system as a whole. How could it not? The question is what to do about it. Christman may not have got me as a teen, but he’s got my attention now. Whether or not Christians should be, it’s clear that they can be leftists. And whatever politics we adopt, he’s right that it had better give priority to losers. One of the oddities of the present moment is that, after a long and tumultuous affair with the left, society’s losers have been shifting rightward. Populism cuts both ways; or, perhaps better put, it runs in either direction. Its irreducibly Christian character is precisely its infatuation with common people. Love for losers began with Jesus, after all, and continued in the Church. If any of us has a chance of entering the kingdom of heaven, it will only be through following them.


Image by Soman, licensed via Creative Commons. Image cropped.

The post Politics for Losers appeared first on First Things.

Finding Private Roy
First Things, September 23, 2025
https://firstthings.com/finding-private-roy/

By the late 1970s, when I attended public high school in rural, blue-collar Central New York, more and more teenagers were living with a divorced parent and a stepparent—meaning, since mothers were almost always granted custody, with stepfathers. Their stories tended to erode the sugar-coating about blended families found in The Brady Bunch and other confections. Some of these stepfathers were awful. A few were monsters. Long before sociology taught us about the importance of intact homes for children, the battered lives of some of my friends amounted to a Q.E.D. all its own.

Yet statistics don’t reveal every truth. My own parents divorced soon after I was born. My mother remarried when I was five, which means that for most of my childhood and adolescence I had a stepfather, too. But he was as far from being an ogre as a man can be.

This stepfather was a man of astonishing good humor and steadiness, despite a life punctuated by many trials. He grew up in poverty on a primitive family farm. He lost his first wife to cancer early on. Decades of manual labor tested his endurance. But there was another test, the most formidable of all. He survived what 12,513 of his fellow American fighters did not: the eighty-two-day Battle of Okinawa during World War II, whose eightieth anniversary, like that of the end of World War II itself, is upon us this year.

The bloodiest battle of the Pacific theater, as well as the largest amphibious assault ever launched in that ocean and one of the largest of all time, Operation Iceberg began on April 1, 1945—which was also, that year, Easter Sunday. Lasting eighty-two days, it involved four U.S. Army and two Marine Corps divisions, a fleet of 1,300 American ships and 251 British naval aircraft, and a Commonwealth fleet including Australian, New Zealand, and Canadian ships and personnel. The Battle of Okinawa was also the occasion of the largest number of Japanese kamikaze, or suicide, attacks. More than twenty-six American ships were sunk in the waters around the island, and 168 severely damaged. Some 40 percent of the battle’s American casualties were sailors lost to those assaults.

Okinawa was “the most nightmarish experience of the entire Pacific war—over 12,500 soldiers and sailors killed, and the greatest number of combat fatigue cases ever recorded of a single American battle.” So writes Victor Davis Hanson in his introduction to With the Old Breed, a first-person account of the ground war in the Pacific by E. B. Sledge, a member of the 1st Marine Division. Suffering fifty thousand casualties during nearly three months of combat, the Allies fought a brutal, sometimes step-by-step war on the island, as the Japanese military withdrew into a labyrinth of caves and tunnels atop and within steep ridges. It was one of the fiercest battles of attrition in history.

My stepfather’s story does not appear in Tom Brokaw’s landmark 1998 tribute, The Greatest Generation, though it certainly could have. Without doubt, like the other subjects on whom Brokaw reported with such admiration, he was formed by the experience of World War II. But his story deserves sharing not only because, eighty years on, the war generation has almost vanished from the scene. Although I hardly knew it at the time, he also imparted without words a wider lesson about how to live with grace in a world so often full of tribulation—and sometimes outright evil. Now words are the only way left of honoring my late stepfather, whose circumspection about his war days I will observe posthumously by using his first name only: Roy. Or, as I knew him, Dad.

The future Private Roy was born three years before the crash, in 1926, in a hamlet at the northeast corner of Oneida County in Central New York, to a family of French-Canadian origin. He grew up in Steuben, another tiny town tucked under the Adirondacks. Named for Baron von Steuben, the Prussian officer appointed by George Washington to train the Continental Army, this little village lies within the tract of 16,000 acres granted to the baron by the fledgling American government. His monument and tomb are a short drive from the now vanished little family farm where my stepfather grew up.

Dad’s mother, it was said, worked before her marriage as a maid at Kykuit, the Rockefeller family estate in the Hudson Valley. Her first husband died in the Spanish influenza of 1918, leaving her with two small boys; she married again a few years later and went on to have Dad, one more boy, and two girls. Dad’s mother and father spoke pidgin French, not English, and the children attended a one-room schoolhouse, still visible, albeit ancient, today. In deepest winter, my stepfather said, they would arrive there on snowshoes or homemade cross-country skis.

The farm lacked electricity for part of his childhood, and the landscape was topped with lake-effect snow for almost half the year. He and his brothers would arise at four o’clock and descend from their room in the attic to milk the family’s cows and feed whatever other animals were around. Their mother would come downstairs soon afterward, to make the day’s food from scratch, usually starting with johnnycakes, whose origins are traceable to the Pawtucket Indians. (His mother never did get used to mid-century technology. Later in life, convinced that since she could see the people on television, they could also see her, she would sit before it only if dressed in her Sunday best.) The family vehicle, during my stepfather’s early life, was a horse attached to a flatbed carriage; later they got a Model T. On Sundays, as a treat, the children would ride whatever transportation they had to a nearby town, often to pick up a new wheel of cheese. Dairy farming being the area’s main industry, family-run cheeseries were commonplace.

Dad’s family lived more like the frontiersmen of the nineteenth century than like twentieth-century Americans. But no American of his age could escape the Second World War. And so, in April 1944—one day after his eighteenth birthday and less than a year from the first landings at Okinawa—a future Army infantryman showed up in Remsen, Central New York. According to his registration card, he was five feet eleven inches tall and weighed 157 pounds, with brown hair and gray eyes.

Private Roy enlisted on December 19, 1944. Following basic training, his unit was sent to Hawaii for more drilling, and finally on across the Pacific. In 82 Days on Okinawa, another first-person account, Army Colonel Art Shaw explains, “The army whipped us into line and turned us into killing machines in only thirteen weeks. Now we were all human torpedoes, butcher boys, gunmen.”

Many sources document the inferno that raged in 1945 on that island four hundred miles from mainland Japan. Crucible of Hell, by historian Saul David, delivers in its title the gist of these war stories. By some estimates, around three thousand people were being killed on Okinawa each day. Joseph Wheelan reports in Bloody Okinawa that on April 20 alone, the 27th Army Division—my Dad’s—lost 506 men, “the greatest one-day loss of the campaign by a division.” In all, a quarter of a million people perished on the spot known today as “Japan’s Island Paradise.” And staggering though their losses were to the Allies, Imperial Japan and the indigenous islanders under its control lost far more. More than 100,000 in the Japanese military were killed or committed suicide, as did a roughly equal number of Okinawan civilians.

Statistics alone cannot capture the savagery that became synonymous with Okinawa. Hundreds of thousands of soldiers armed with massive weaponry contested amid civilians on an island smaller than the city of San Antonio. The intense concentration of men and arms was also pounded throughout by nature, including heat, monsoons, and mud, as well as by miseries of infestation such as huge flies, bloated from feeding off bodies. The result was a relentlessly putrid, polluted war zone. As one sergeant observed of the deadly struggle over a rise nicknamed Sugar Loaf Hill, which cost the 6th Marine Division thousands of casualties: “We were fighting and sleeping in one vast cesspool. Mingled with that stench was another—the corrupt and corrupting odor of rotting human flesh.”

Horrors that didn’t have a name abounded. Here is Sledge after describing a scene in which enemy artillery shells pierced through soil and mud to upend newly buried Japanese corpses, ­scattering maggot-ridden body parts all over a band of American soldiers: “We didn’t talk about such things. They were too horrible and obscene even for hardened veterans. The conditions taxed the toughest I knew almost to the point of screaming. . . . [It was] preposterous to think that men could actually live and fight for days and nights on end under such terrible conditions and not be driven insane.”

Death loomed everywhere—underground, on the surface, in the skies. At sea, suicide planes and suicide boats menaced the Allies constantly. On land, American soldiers saw, and did, the barely imaginable. They poured napalm into caves full of holed-up Japanese fighters and ignited them with phosphorus grenades. They manned Sherman tanks that launched enormous spires of fuel, with a reach of up to eighty yards. “The flames couldn’t be put out before they were finished burning,” notes Shaw; thereby were some of the enemy roasted alive. Titanic artillery fire rocked the island day and night, even as much of the combat remained face-to-face, and at all hours, in bunkers, pillboxes, caves, and foxholes.

Some Japanese soldiers mutilated enemy corpses. Some Americans desecrated bodies too, including by pulling katanas out of dead officers and keeping them as trophies. Nor was it only men in uniform who participated, willingly or no, in brutality. American troops watched in horror as native Okinawans, drilled in propaganda about American rape and murder, killed themselves pre-emptively by taking strychnine, or by strangling one another, or by jumping off cliffs into the sea; some also murdered their own families. Many thousands of Japanese soldiers died by suicide, especially during the last days of the war. Their methods of destruction were myriad: racing into American fire, falling onto a sword, being strangled by friends, or hugging grenades—a practice known as “poor man’s hara-kiri.”

So monstrous was the moral and human apocalypse of Okinawa that it proved a forcing crisis for two more acts of mass destruction. As death spread its dominion across land, sea, and air, President Harry Truman and his military abandoned their planned invasion of the home islands, rather than risk “Okinawa from one end of Japan to another,” as he put it. The cancellation of Operation Downfall undoubtedly saved the lives of hundreds of thousands, perhaps millions, of American soldiers (and Japanese soldiers, too). Its collateral damage was also fearsome, as two hundred thousand more lives—mostly civilians—were lost to the atomic bombs dropped on Hiroshima and Nagasaki on August 6 and 9.

Even the end of the war itself, on September 2, 1945, could not erase the human devastation. For decades after the battle, Wheelan reports in Bloody Okinawa, farmers and construction workers routinely uncovered corpses still mired in the haunted soil. This was the netherworld that my stepfather, alongside his fellow soldiers and sailors and Marines and airmen, had entered on Easter Sunday 1945. Following the headlong plunge into death that was the Pacific Theater, peace descended just as suddenly, and to wild jubilation, as the war ended on September 2. Twelve million Americans in uniform returned home, and warriors were again made into family men. There was a marriage boom, and a baby boom, and picket fences were layered en masse, and speedily, over the trauma. One by one, Americans drained by the war turned away from Hades, and back to home and hearth. “For many [veterans],” as Brokaw observes, “the war years were enough adventure to last a lifetime.”

In this new domesticity, Dad’s life followed the script of the Greatest Generation. He returned to upstate New York in 1946 and married his high school sweetheart. In short order they had five children, including twin boys who would later be drafted during the war in Vietnam. For much of that time, the young veteran worked as a logger in Oregon, a region he particularly loved; he made a point of traveling to forty-eight of the fifty states, and he often told his children that Crater Lake was the most beautiful spot on earth. Following a fall from a redwood tree that broke several bones, ending his work as a lumberjack, he returned with his family to Central New York in the late 1950s—only to lose his first wife unexpectedly young, to cancer.

Sometime in the early 1960s, working as a handyman in a small nursing home near the hamlet where he grew up, this widower met my mother, a newly divorced nurse with two little girls. Two years later, they married, and they went on to have three more boys—including another set of twins, one of whom would become a career Marine. For the next three decades, my stepfather worked many jobs, often simultaneously, most of them blue-collar, scattered across a fifty-mile or so radius in rural Central New York—changing towns and hamlets as often as needed for work, or on a whim, or both, with my mother and their newly blended family.

Brokaw’s book speaks of the “self-reliance and gratitude” exhibited by the men he profiled. I would add, in my stepfather’s case, exceptional improvisation. Throughout the years, he worked variously, sometimes simultaneously, as an auto mechanic, carpenter, electrician, mason, bartender, and fix-it man. I have never known anyone as competent as he was with tools of every kind. When a back injury took him out of the heavy labor force for a year, he pivoted once more, and moved us to the upstairs of a historic, if dilapidated, tavern that he ran with my mother for a while on a defunct branch of the Erie Canal.

Adaptability, flagged by Brokaw as another characteristic of the war generation, was matched in my stepfather’s case by social energy, especially outdoors. At different times, he was an amateur stock car driver, a motorcycle enthusiast, a farm team baseball player, a square dance caller. For years he coached both my brothers’ Little League team and the softball team to which my sister and I belonged. He took all of us fishing in the region’s abundant lakes and creeks, and if we didn’t learn patience from him, it wasn’t for want of example; no one was ever more content to stand by water’s edge for hours on end, waiting. Like most men in the region, he also hunted occasionally—but only if he kept the kill for food. It was said that the former Private Roy was an exceptional shot.

He was a man’s man, in the best sense, with infectious confidence, a favorite among his peers in an age when manliness was prized. Even in explosive situations, he maintained an epic cool. Once, when a local boy stepped out of line with one of his kids, Dad called the offender and his father into our kitchen as I hid around a corner, watching. As the nervous, unwilling guests entered, my stepfather took his Buck hunting knife from its sheath, placed it on the table with no explanation, and informed them both in a low voice that there would be no more transgressions. And there weren’t. (This scene, buried in memory for decades, resurfaced only recently, when I first heard Rodney Atkins’ popular country song, “Cleaning This Gun”—which makes the same point.)

Like those of his fellow veterans, my stepfather’s postwar course, and my mother’s, too, were worlds removed from the collapse of American community described in Robert Putnam’s 2000 study, Bowling Alone. Though we moved often—nine times in thirteen years—my parents were popular in every village and town where we landed, and they ran in convivial circles. Since most families in that time and place couldn’t afford babysitters, the adults’ social life was ours, too. Card games like pitch, poker, hearts, and rummy, played with rotating family and friends of all ages, took us through long winter evenings. Cigarette smoke was everywhere, a toxic if indisputably common bond. Johnny Cash eight-track tapes ruled in the car. Church was another constant of the landscape, wherever we were; my brothers were altar boys, my sister and I sang in choirs. We also marched in a drum and bugle corps that paraded through quaint tiny towns.

In summertime, like other local families, we were regulars at “field days”—open-air festivals of rides and carnival games by day that turned, by night, into dancing and drinking marathons. And though other kids sometimes dreaded the evening hours, especially those whose fathers or stepfathers were drunks, our clan never had to worry. Thanks to Dad, we were always safe.

“They were proud of what they accomplished,” writes Brokaw of the veterans, “but they rarely discussed their experiences, even with each other.” Neither did my stepfather, for the most part, so the handful of exceptions bear mentioning.

One was a story about that training time in Hawaii, as he prepared with thousands of his brothers in arms to ship out. Toward the end, he told me, soldiers were granted some free time, as a break before heading off to war. Most of the guys he knew spent those unsupervised days and nights as one would expect—drinking, gambling, chasing girls. But for some reason he never offered, maybe because it couldn’t be explained, Dad chose to devote his leave time to something else: optional extra lessons in hand-to-hand combat, from an Army veteran in Hawaii who offered them on the side. Those lessons, he believed, saved his life in the foxholes to come.

He was wounded in Okinawa, lightly, three times—twice by shrapnel and once by a bullet that passed clean through his hand. Despite that, only one physical “tell” of the war’s trials remained for the rest of his life. Days spent in wet foxholes led to jungle rot, which in turn morphed into chronic psoriasis on the soles of his feet. They were excruciatingly sensitive, and everyone at home and work knew not to pass near them, even when they were protected in the thickest of steel-toe work boots.

In other ways, my stepfather departed from the generational script. Historian Allan Nevins asserted famously in 1946 that “probably in all our history no foe has been so detested as were the Japanese.” Yet oddly enough, and despite Okinawa, one would not have known this from listening to Dad. He made the point that he admired the discipline and courage of the Japanese soldiers—and he volunteered that he never took a katana out of any dead fighters, from respect. In 1980, when the televised version of James Clavell’s Shogun, a drama about feudal Japan, became a massive hit, I gave him a copy of the book because of his fascination with things Japanese. Dad’s last grade of education was ninth. It was the only volume I ever knew him to read all the way through.

Dad also diverged on another point. Though the postwar years were marked by a religious boom that persisted into the 1960s, he bucked this trend. To be sure, he was nothing if not Catholic; he did not lean toward any other sect, and he believed, and imparted, that the Church taught truth. But rarely did he attend Mass. (Neither, for many years, did my divorced mother, until being granted an annulment.) By way of explanation, he would only say cryptically that he and God—I am quoting from indelible memory here—had “come to an agreement in a foxhole.” Even so, he saw to it that the children of the house attended regularly, including on holy days of obligation, and that we participated in parish life as well.

Like any child, I took the status quo of our household for granted, including its religious paradigm—in our case, that kids go to church and that parents who don’t will nevertheless insist on Catholic rules. Not until many years later would I realize how far from normal our family’s religious regimen was.

Though individual experiences were downplayed by adults of that time, the war itself was hardly omerta—far from it. Many of my parents’ friends were veterans. Though I don’t know of any who survived alongside my Dad, I do recall one who’d served in the European theater showing hushed respect in his presence, shaking his head gravely on hearing that word, “Okinawa.” Even so, my stepfather exhibited no visible signs of trauma; he was neither a brute, nor an addict, nor a recluse. A rare exception to his calm broke through in the early 1970s, when my mother reported his suffering a few intense episodes of hallucinatory night terrors. During them, he was insensible to all else for some long minutes, she said. The script was the same each time: sitting up suddenly in the dark, yelling the names of men she’d never heard of before, and begging those companions to “get up, get up!” Dad claimed no memory of these events. My mother believed, and she was surely right, that they were triggered by the drafting of his older two sons, one of whom had just been sent to Vietnam.

One last detail of war shared by my stepfather concerned his unit’s munitions bearer, known to us only as George. A black man, George was assigned to support services, including stretcher-bearing. (The U.S. Army would not officially integrate African-American soldiers into combat roles until 1948, when Truman signed an executive order mandating the desegregation of the armed forces. The last segregated units were not dissolved until 1954.) My stepfather always referred to George in admiring tones—in fact, “George” was the only name I recall his sharing with us from his war days. He mentioned more than once that George had courageously carried away wounded soldiers under enemy fire, and he told us—also more than once—that in his opinion, black American soldiers should have been fully armed and trained.

Though there’s no way of knowing for certain, that friendship under fire between my stepfather and George might just have had something to do with one other chapter of Dad’s postwar life—in some ways, the most remarkable of all.

In the mid-1970s, my stepfather settled into the longest-running of his many jobs: head mechanic at a “youth camp” tucked away in a sylvan corner of Oneida County. The bucolic phrase amounted to window-dressing for a minimum-security pre-prison of sorts for boys aged fourteen to seventeen, who had been sent up on various criminal charges. Most were from New York City, nearly all were fatherless, and a majority were brown or black. Too young for prison and too problematic for society, these young offenders were being held in “camp” for sentences of a year or so, in hopes of rehabilitation.

Some of the rural-born men charged with supervising these rejects of New York City held the kids in contempt—but not my stepfather. In another turn I took for granted as a child that in retrospect cries out for reflection, Dad became something else: the camp’s de facto mentor-in-chief, teaching not from the classrooms where campers were counseled by social workers with college diplomas, but instead in the garage, the place some boys seemed to like best of all. There, the practical skills he’d picked up through decades of varied work were imparted to willing campers. The list in full over the years can’t be captured here. But several of the boys whom my stepfather took under his wing, episodically signing them out of custody and into our homes for dinner, as a treat, remain vivid in memory.

One was Diamond, a small, scarred Puerto Rican boy from the Bronx. He was caught at the age of fourteen having stolen $4,000 worth of loot (about $23,000 today), to qualify for membership in the local gang. Diamond was quick to learn about engines, Dad reported. With no real home to go back to, he ended up staying longer than most in the camp and hence had extra time with my stepfather. One of the nights Diamond came to dinner, my mother surprised him with a cake for his fifteenth birthday. On seeing it, as I will never forget, this tough son of Gotham broke down crying like a toddler. No one had ever celebrated his birthday before. Eventually, Diamond returned to New York and worked as a mechanic for some years.

Another of my stepfather’s protégés was an African-American boy named Lloyd. Born and raised in Harlem, he was one of the few campers with no real criminal record. A fearful mama’s boy, by his own description, he was sent to camp for truancy after refusing to join the local gang, whose leader had threatened to break both his thumbs. My stepfather tutored this protégé in tools, including how to use a chainsaw to fell trees. (Though as Lloyd pointed out in our house one afternoon, if he took that skill to Central Park, he’d be arrested again.) Humorous and outgoing, Lloyd was a family favorite, and he, too, stayed in touch for years after returning home.

Not all the campers fared well. Jed, a white boy from the Hudson Valley, whom my stepfather assessed as possibly the most skilled thief in all the institution, also hung around the garage. Shortly after being discharged, he was arrested again for something serious and sent on to real prison, not camp. Then there was Jimmy, a thin African-American boy from Bedford-Stuyvesant, at that time one of the worst neighborhoods in all New York. Quiet and observant, with knife scars on his face, Jimmy shadowed my stepfather for months and was another sometime guest in our home. Dad thought Jimmy promising and talked to him, and about him, often. Then, the night after returning to the city, Jimmy was killed in a knife fight over a card game. Half a century later, Dad’s grief on hearing that news reverberates. One death too many, Jimmy’s end marked a close to the itinerant hospitality in our home of some of New York City’s spurned sons.

Early in the 1970s, some of the boys rioted, momentarily imperiling the extended property and its authorities. Throughout that day, fires were set in the main building and elsewhere, and campers armed with improvised weapons roamed the grounds. Their ringleader, an imposing young man named Rico, entered the camp’s garage without permission along with several others, only to find my stepfather at his metal desk amid the camp’s vehicles.

The potential for destruction in that moment must have been prodigious. Buses and trucks were on hand to be stolen or vandalized; tools could easily have turned into weapons; and all that stood between the boys and the garage was a single, unarmed man—Dad. As he told the story, a Western-type standoff ensued in the garage that day. Rico surveyed the scene in silence for a few tense moments while his followers stood still, waiting for a sign. Finally, Rico ordered, “This is Roy’s place. Leave it alone.” And so the marauders did, rendering the garage the only major site left unmolested.

The shepherding of so many boys remains an extraordinary record of voluntary, unremunerated mentorship—the more so for Dad’s patent indifference to race, or color, at a time when indifference was anomalous. Nor did he seem perturbed by the unsavory circumstances that landed his charges upstate in the first place. On the contrary, he mixed the boys from time to time among his own wife and children—as effortlessly and as unconcernedly, one might say, as he mixed my sister and me with his own kids, along with the parade of neighbors and friends and other stray souls who came and went from our homes, wherever we were living at the moment.

My stepfather likely never encountered the Jesuit motto, “a man for others.” He just lived it. At eighty-six years old and in his right mind, as ever, he took himself off dialysis, knowing the decision would end his life within weeks. He filled those remaining hours surrounded by family and friends. A few days before dying, he had breakfast with an old buddy and told him with a grin, “Vinnie, this is it! This week I’m going to heaven.” No one listening had a dry eye. Except Dad, unflappable as usual.

Not long ago, I read for the first time J. Glenn Gray’s classic 1959 study of the psychology of combat, The Warriors: Reflections on Men in Battle. I was especially taken by the author’s thoughts on what he calls the “ache of guilt” carried by soldiers after war. That ache, Gray observes, has far-reaching consequences. Some men are driven mad by what war has made them do. Others follow a different route toward oblivion, and long for death themselves. Still others escape by throwing themselves into sensualism, a life of distracting pleasures. In measuring those possibilities against my dad, I find that none fit the bill. But there was one other response remaining on the list.

If a soldier is strong enough, says Gray, “Atonement will become for him not an act of faith or a deed, but a life, a life devoted to strengthening the bonds between men and between man and nature.” I thought back to Brokaw’s book about my dad’s generation. Atonement: That word goes missing in action from the Greatest Generation narrative. Yet it might, in the end, help to explain not only my stepfather’s vibrant postwar years, but by extension, those of the many other souls scarred in the Second World War whose lives became so justly admired.

After all, with atonement comes grace; and grace, more than any other factor, is surely the invisible filament binding together the pages of my stepfather’s story—a grace that appears even to have pierced some of those around him, protecting them against harmful rays of the age. Because of Dad, small knots of working-class people and their kids found genuine community. Because of him, some thrown-away boys, abandoned by everyone else, had a shot at rejoining society. Because of him, a single mother gained a second chance at marriage and family; and two girls who weren’t even his, one of them this author, would dodge the grim arithmetic of a ruptured home and know a father’s steadfast love.

What, besides grace, can make sense of Roy’s life? Somehow, inexplicably, this rural white man of his time remained completely untainted by racial prejudice. Somehow, just as unlikely, boys intent on violence one day passed him by. Somehow, mirabile dictu, a bunch of kids whose parents didn’t attend church were raised Catholic. And somehow, a soldier who suffered and took part in one of the grisliest battles on modern record retained not only a lifelong respect for his enemy, but in the end, a certainty about his own reward.

With all due appreciation for Tom Brokaw, The Greatest Generation misses something essential. Maybe the remarkable accomplishments of yesteryear’s veterans, many as unknown to the larger public as my dad’s, were driven in part by something unseen. That supernatural evil has a hand in the destruction known as war has never been hard to entertain. Less visible is something we’re told is also true: that where sin abounds, grace may come to abound more.

Neither I nor most of those reading this have served in combat or been driven to kill—let alone been part of an annihilation like the Battle of Okinawa. But all of us, in the community of sinners, are just as much in need of atonement and charity as were the soldiers described by Gray. And maybe that is the final lesson left by the Greatest Generation, a lesson that soars beyond the war years, and the postwar years, into eternity: Only love, and love’s propitiation, make possible for any of us a shot at that ultimate victory, redemption.

The post Finding Private Roy appeared first on First Things.

Protestants Against the Pill https://firstthings.com/protestants-against-the-pill/ Mon, 22 Sep 2025 05:00:00 +0000 https://firstthings.com/?p=102976 Ben Jefferies is an Anglican priest who says he knows that one of his parishioners throws away all the tracts he’s written on “Marriage, Sex, and Babies” when he’s not looking. He keeps them in the lobby...

The post Protestants Against the Pill appeared first on First Things.

Ben Jefferies is an Anglican priest who says he knows that one of his parishioners throws away all the tracts he’s written on “Marriage, Sex, and Babies” when he’s not looking. He keeps them in the lobby of his church, alongside a number of other tracts on “things Anglicans believe.” Jefferies laughs good-naturedly when I ask how his parishioners receive his teaching on contraception. His own belief on the topic, though informed, differs substantially from what most Anglicans believe, at least in practice. In fact, Jefferies’s teaching on the matter is similar to that of the Catholic Church, proffering Natural Family Planning paired with periodic abstinence as the standard means by which Christians should avoid pregnancy.

The Catholic view used to be, well, catholic. Martin Luther and John Calvin regarded contraceptive sexual acts as a grave moral sin; this was the universal Christian position until the 1930 Lambeth Conference, at which Anglican leaders gave their official opinion that contraception was not in all cases sinful. Other denominations quickly followed. But though Protestants on the whole have left behind contraception as a moral issue, a growing number of Protestant women have begun to reject the pill. I interviewed three who are discovering more than Natural Family Planning—they are discovering an embodied faith.

Kelsey Meyers, twenty-five, is a new wife, mom, and lawyer who attends an Anglican church in Washington, D.C., with her husband and infant son. We both fed our babies as we talked on the phone.

Kelsey had been prescribed the birth control pill as a treatment for hormonal acne when she was in high school, and again later when she approached her doctor with symptoms of polycystic ovarian syndrome. But she didn’t like how the pill made her feel, and it didn’t seem to address her symptoms. “Every time I came with an issue, that was the Band-Aid solution that they slapped on it.”

Kelsey’s experience isn’t unique. Women are prescribed hormonal birth control for many issues: acne, mood swings, irregular or painful or heavy periods. In the United States, hormonal birth control is prescribed to girls as young as eleven, and there are neither federal age restrictions for its use nor longitudinal studies of its effects on girls who have yet to undergo puberty.

Once she got engaged, Kelsey researched alternatives to hormonal contraceptives. Catholic friends encouraged her to read Taking Charge of Your Fertility by Toni Weschler and The Genesis of Gender by Abigail Favale. Kelsey began to use the Tempdrop fertility tracker, which uses basal body temperature to track ovulation. She felt that it helped her better understand her body, including some of the symptoms for which she had approached her doctors. “I kind of came to realize that a lot of these things that doctors had told me were wrong with my body, like these longer periods and these longer cycles, were actually just my body operating normally.”

Kelsey began to question the morality of contraception. Whereas Catholics have a well-developed teaching on marriage and sex, including a prohibition on all forms of contraception, the nondenominational Protestant churches in which Kelsey grew up never taught on contraception. Across Protestant churches and denominations, there exists little to no engagement with the morality of contraception or with the Catholic arguments against it. “I would just love to see more Protestant women discussing what a consistent ethic with this is,” Kelsey told me. “I don’t have a consistent theology right now behind it. I’m still learning. But having the conversation is important.”

Chaney Gooley, thirty-three, is one of the few Protestant women I know who refused the use of contraception on almost purely theological grounds. She attends an Anglican church in Alexandria, Virginia, with her husband. But like Kelsey, she first began questioning the morality of contraception due to questions about health: not for herself, but for her future baby.

Through her volunteer work with a pro-life sidewalk ministry, Chaney learned about the potential abortifacient effects of hormonal birth control methods, including the pill. “I would never want to do anything to cause a baby to not have the nutrients it needs to implant in my womb,” she told me on the phone as I watched my children play in the front yard on a warm afternoon in March.

Birth control pills and IUDs work in three main ways to prevent pregnancy: a primary means and two secondary backups. First, they suppress the hormone that triggers ovulation, reducing the likelihood that an egg will be released. Second, in the event of “breakthrough ovulation,” they thicken the cervical mucus that carries the sperm to the egg, providing a physical barrier. Finally, as a fail-safe, they thin the lining of the uterine wall to prevent the implantation of a fertilized egg. “In moral philosophy, that’s an abortion,” Jefferies told me. Breakthrough ovulation—the release of an egg despite the use of hormonal birth control—varies in frequency from woman to woman. Nearly half of all women on the mini-pill (a progestin-only pill) continue to ovulate. Breakthrough ovulation is somewhat less frequent in other forms of hormonal contraception, but women who use it while sexually active for several years are likely to have at least one instance of breakthrough ovulation.

Most pro-life women don’t realize the potential abortifacient effects of hormonal birth control. This is partly because medical experts have changed the definition of “pregnancy.” In 1965, roughly concurrent with the development of the pill and the IUD, the American College of Obstetricians and Gynecologists updated its definition of “pregnancy,” which was now said to begin at implantation rather than fertilization. Hormonal contraceptives were therefore classified as birth control rather than abortifacients. Previously, in medical and lay contexts alike, pregnancy had been understood to begin at fertilization, the moment when a unique organism is created. Just a dozen years before the change in definition, Watson, Crick, and Franklin’s identification of DNA as the chemical signature of a unique living being had brought new insight into that moment of beginning. The change in definition denied a reality that was being understood with greater exactness every year.

Chaney also takes issue with the way hormonal contraception changes the female body. She calls it “antifeminist” to make women’s bodies “more like the male body”—that is, unable to conceive. She thinks that fertility cycle charting should be taught to girls as they enter puberty, to foster “body literacy.” And as a woman with endometriosis, she believes that Natural Family Planning can help diagnose reproductive diseases earlier. (Endometriosis, a known cause of infertility, takes on average seven to nine years to diagnose, despite affecting 10 to 15 percent of women of reproductive age. Symptoms of endometriosis are most often treated with the pill, despite the fact that it does not cure or even curb the underlying condition.)

In addition to gaining a better understanding of her body as God created it, Chaney has realized that there is “a deeper spiritual meaning to keeping the unitive and the procreative purposes of sex united”: She and her husband practice continence during her fertile window when they prefer not to conceive, yet they leave open the possibility of children when they do come together. She compared the marital act to participation in Eucharistic union with Christ: “We are to take and eat and to be united with him, and whether we believe that’s symbolic or literal or somewhere in between, there’s a very real sense in which he has given himself to us on the marriage bed of the cross.”

Ben Jefferies, like many Protestants I’ve talked to, found his way into thinking about contraception through the writings of Pope John Paul II, which he praises for expressing a poetic and poignant view of marriage, despite “being written by a chaste man who was never married.”

Though the Anglican Communion broke with the broader Church’s position on contraception at its Lambeth Conference in 1930, Jefferies reads its resolutions as more aligned with historic Christian teaching than most modern Anglicans do. He reads them as explicitly sanctioning only condoms (since hormonal methods did not yet exist), and only when there is, in his words, a “clearly felt moral obligation to limit or avoid parenthood and . . . a morally sound reason for avoiding complete abstinence.” (Interestingly, Jefferies’s reading of the Lambeth Resolutions on contraception is similar to the stance recommended for the Catholic Church by the Pontifical Commission on Birth Control in the 1960s.) Though Jefferies thinks there are times in the lives of many married couples when both of these provisions are met, he explains that those times are few, and that abstinence (perhaps paired with a fertility awareness–based method) is the only licit means for Christians to avoid conception at all other times.

In defense of his pastoral allowance for the use of barrier methods when circumstances warrant, Jefferies told me that he believes that Pope John Paul II’s phenomenological interpretation of the body does not take “sufficient account for the degree of depth to which things have become broken and muddied by the Fall.” He mentioned specifically the emotional, financial, and health implications of a couple’s having a child “every twelve to eighteen months for a twenty-year window,” which may be the ideal in a perfect world but is quite difficult to achieve in our fallen state.

He thinks the Lambeth approach of permitting certain contraceptive methods in some cases at some times is more fitting. “I do sometimes wish that we had a stronger magisterium,” he told me. But he regards “trusting the Holy Spirit to be the teacher” as a more patient and persuasive pedagogy, “so the actual pastoral outcome is comparable or better to what the Roman Catholic Church is getting with their ironclad magisterium.”

Brooks Anderson is a forty-five-year-old birth doula, graduate student, and “not-so-stay-at-home stay-at-home mom” to five children in Albuquerque, New Mexico, where she and her family attend a Presbyterian church. She and her husband have primarily used Natural Family Planning for the nineteen years of their marriage. Her youngest child was born when she was forty-three.

Brooks is the eldest of eight living children. Her parents were influenced by the Quiverfull movement, which teaches that because children are a blessing from the Lord, Christians should try to have as many as possible. Brooks remembers with awe her mom’s pregnancies and births, which impressed her with the miraculousness of the female body.

Brooks fondly described watching her younger sister’s birth when she was six years old. “The doctor put gloves on me and let me feel the placenta and explained how a baby lives and the way that God designed it.” For one of her brother’s births, she cut the umbilical cord. She remembered less fondly her mom’s last pregnancy, at forty-four, when Brooks was a junior in college: “You can’t get away from knowing that your parents are still having sex because there’s your mom, waddling around campus.”

Brooks witnessed five of her seven siblings’ births, including the ones that nearly claimed her mother’s life. Brooks’s parents’ theology has shaped her. But so has the difficulty of her mother’s pregnancies. “I really resented [that] my parents [kept] having children” despite her mother’s rare blood disorder and severe morning sickness. “I was like, ‘I’m not going to do this to myself, right? I’m going to respect myself.’”

She recalls writing a college paper on the question of whether Christians should use birth control, and in the course of writing she encountered Catholic theology of the body. She also learned about the negative effects of hormonal birth control on the female body. Though she ultimately rejected her parents’ maximalist Quiverfull approach, she realized that whether or not to use birth control was a question that merited prayerful consideration. A few years later, she discovered Natural Family Planning, and she came to appreciate the way it honored both the female body and God’s design for marriage and children.

She reflects with satisfaction on the times in her marriage when she and her husband used NFP: “We were not holding sex in a controlling way over one another, and we were both taking our desire, fear, whatever to God individually.”

A 2020 study by the Catholic Medical Association shows a 58-percent decrease in the likelihood of divorce among couples who have used Natural Family Planning. Other studies show an even greater association between NFP and marriage stability. Whether that association is correlative or causal, Brooks emphasized the need of husband and wife to be equally yoked—to be on the same page about these things, and for neither to pressure the other—especially since NFP requires self-control in the form of periodic abstinence.

As her children vied for her attention on the playground, Brooks told me that she had never heard a pastor discuss contraception from the pulpit. “Everything that I had always been taught had been geared towards me as a woman, like you submit to your husband. You don’t say no to sex. Don’t deny your husband. You make yourself available. It was never taught from a place of mutual responsibility and mutual honor.”

She contrasted that messaging with advice she heard from a Catholic priest: When looking for a spouse, look for someone with self-control, the ability to fast, because such a person “will be able to do hard things that will be necessary in marriage.” Brooks elaborates: “If you truly do Natural Family Planning, you have to say no to your own desires at times . . . and recognize that denying ourselves draws us closer to God.”

It may seem implausible that something as private and, well, human as a menstrual cycle could draw a woman into a deeper relationship with God, with her husband, with her own body. It’s no surprise that Jefferies’s tracts on contraception keep going missing. Many Christians prefer to keep God out of the bedroom. But Christians believe that Jesus submitted himself to the confines of a womb, a womb that underwent the same physical changes some Protestant women are beginning to embrace as part of their embodied faith. And as Chaney told me, it’s hard to keep God out of the bedroom when you keep a crucifix above your bed.

A Life Worth Hacking https://firstthings.com/a-life-worth-hacking/ Fri, 19 Sep 2025 11:00:00 +0000 https://firstthings.com/?p=102925 Earlier this year, I traveled to Texas to spend a few days with a few thousand people who were convinced that they were going to live forever. Or, if...

The post A Life Worth Hacking appeared first on First Things.

Earlier this year, I traveled to Texas to spend a few days with a few thousand people who were convinced that they were going to live forever. Or, if not exactly forever, then at least for a very long time and in very good health.

How? Through the good graces of biohacking, which a banner on the conference’s elevator door helpfully explained is “the art and science of changing the environment around you and inside you, so you have more control over your own biology.” 

Some of the contraptions on display were heavy on both the science and the control. The Ammortal Chamber, for example—retail price $159,500—is a wonder of modern engineering, resembling a prop from the Alien movie franchise. The occupant is treated to everything from 600 milliliters per minute of molecular hydrogen to red light from hundreds of little lightbulbs (at a wavelength of 660 nanometers, if you must know). A twenty-minute session, I can attest, leaves you focused, relaxed, and recharged—which is why the Los Angeles Dodgers keep a few in their locker room.

Other offerings at the conference were less meticulously engineered. I was fascinated by the self-described “conscious DJ” whose workshop focused on “activating the frequency of Earth 2.0,” but I didn’t reach for my wallet. Butter-infused coffee shakes, electromagnetic pulse belts for your dog, and a plethora of pills promising to do everything from tricking your stomach into thinking it’s full to tricking your body into believing it’s young—there was no shortage of offerings that felt, even to the most open-minded newcomer, like hokum at best or a pernicious hoax at worst. 

New Age affectations aside, biohacking is no laughing matter, and people of faith in particular should pay attention to and take heart in the movement’s rapid growth. How rapid? Valued at around $24 billion last year, the biohacking industry is expected to climb to a market size of somewhere around $70 billion by 2030, which should come as no surprise. A struggling traditional healthcare system, an aging population in which older adults are projected to eclipse children in the next ten years, and a robust tech sector that finds all areas of human life ripe for disruption: these trends make for excellent conditions to encourage the building of machinery that addresses the most pressing biological challenges of the human species.

But financial returns alone are no reason to cheer on biohacking. Look deep into the heart of the movement, and you’ll find three reasons for wild optimism, suggesting that, maybe, the biohackers herald the coming of a new American Golden Age.

First, better than any other group in recent memory, they embody the all-American self-­starting spirit, with their own bodies as the new frontier. While most of us interact with technology by pressing buttons on social media platforms or doomscrolling mindlessly on our smartphones, content to follow the dictates of algorithms we don’t really understand, the biohackers are cut from different cloth. They buy—or, often, build—imperfect tools, download or swap intricate user manuals, and flock to online forums where they share their rigid regimens with each other. Whether or not these tools and regimens actually work is beside the point; what matters is that biohacking is increasingly becoming a spirited alternative to the convenience-addled consumerism that defines so much of modern life. The last time so many tinkerers tinkered with such intensity and such a sense of purpose, we were treated to the birth of Silicon Valley.

The digital revolution, however, was a revolution of the mind; biohacking is, first and foremost, an uprising of the spirit. Had you closed your eyes and listened to the keynote speakers at the conference, you might’ve thought that you were in church. Hal Elrod, for example, one of the movement’s philosopher kings, not only shared the story of his remarkable bout with cancer but also delivered a methodology for life—reflection, gratitude, and all that good stuff—that would’ve caused any priest, rabbi, or imam to nod approvingly. And Martin Luther King III received a big round of applause when he argued that his father was a biohacker, too, teaching people to choose love and goodness over more dopamine-inducing sensations like lust and rage.

The second great virtue of the biohacking movement follows directly: Its members understand and celebrate the fact that biological improvements must begin with the soul. A person who is uprooted from community and tradition may snack on supplements all day long and take cold plunges on the hour and still fail to reach anything approaching true wellness.

Which brings us to the third and most crucial reason to feel grateful for the biohacking enterprise: It puts the focus squarely where it belongs, on the human body. In many ways, the history of the last fifty years has been the history of disembodiment. In the marketplace, sinister corporations have been working assiduously to reduce us to avatars and replace old-fashioned human interaction with virtual pokes, snaps, and chats. In the public square, so-called “progressive” political forces have waged devastatingly effective campaigns designed to contest the inimitable nature of flesh and blood. First, they tell us that abortions are a choice rather than a termination of a body, and then that biological sex, encoded into every cell of our bodies, is a fiction that can be refused and then replaced on a whim with any number of ludicrous fabricated categories, from “pangender” to “agenderflux.” 

The lies have been many, but their aim is always the same: to deny the human body its singularity and its sanctity, and to enforce instead a dubious theology that appoints us all as creators unto ourselves, free to pursue ecstasy by any means necessary. Biohacking, though not a conservative movement in any political sense, rejects this fevered notion of gnostic self-creation. The message is consistent: Pay attention to muscles and sinews, limbs and spine, liver and heart. This requires not only admitting that you are an embodied being, but celebrating it as well. The core ambition of biohacking is to learn how to enhance the life-giving potential of our bodies rather than trying to free ourselves from nature’s endowment. 

Hillel the Elder would’ve approved. One of the Talmud’s greatest sages—you may know him from such famous sayings as “That which is hateful to you, do not do to your fellow”—may have been a biohacker avant la lettre. As legend has it, one day he promised his students to teach them a very elevated lesson in spirituality before promptly rushing to the bathroom and locking the door. When he reemerged, the students, stunned, asked him precisely what they were supposed to learn from watching their great teacher relieve himself. Hillel didn’t skip a beat. “If I didn’t do what I just did in there,” he replied, “I wouldn’t have been able to teach you anything at all.” Because a body that can’t relieve itself, he realized, also can’t pray. Or, put bluntly, to save your soul, you need to respect the reality of your body first. 

Our nascent biohackers are doing just that. They’re monitoring oxygen levels and wearing special shades blocking out blue light, but they’re also thinking seriously about what makes life worth living, and the answers they stumble upon are often the time-honored and correct ones. Spending three days in their midst, I heard more about trusting in a higher power, caring for each other, and keeping away from all forms of corruption, physical and metaphysical alike, than I have anywhere outside of my synagogue. And I felt, too, a strong sense of budding community. All those fit and friendly people may have come to the conference primarily to care for their own aging bodies, but they realized somewhere along the way that aging, if done properly, is a team sport. 

Which is a tremendous opportunity for us faithful. Think about it: Tens of thousands of America’s most resilient, optimistic, driven, and—biohacking gadget prices being what they are—wealthy people are sharing our passions and our convictions. We’ve much to offer them, from a warm handshake (in contrast to our self-appointed elites who insist that health and wellness must be left exclusively to the state’s zealous official experts) to an invitation to plug into the millennia-old wisdom that already guides our lives. And they’ve much to offer us, too, reminding us that very little by way of transcendence will occur unless we first get ­serious about caring for the flesh.

Bright Girdle Furled https://firstthings.com/bright-girdle-furled/ Thu, 18 Sep 2025 05:00:00 +0000 https://firstthings.com/?p=103307 Light on Darkness restores liturgy to its place at the heart of the medieval world. Like a Jesse tree, its trunk sprang from roots in the deep soil of antiquity, and its branches...

The post Bright Girdle Furled appeared first on First Things.

Light on Darkness:
The Untold Story of the Liturgy

by cosima clara gillhammer
reaktion, 256 pages, $25

Light on Darkness restores liturgy to its place at the heart of the medieval world. Like a Jesse tree, its trunk sprang from roots in the deep soil of antiquity, and its branches reached toward every corner of the heavens. When we read any writer of the Middle Ages—poets, theologians, historians, philosophers—we read words shaped by the liturgy of the Church. It was the common experience of every Western Christian, be he king or knight, priest or peasant. Those who knew Virgil and Cicero, those who knew Duns Scotus and Augustine, those who knew Burnt Njal and Egil Skallagrimsson, and those who knew no more than the half-pagan folklore of their homes, all knew the Mass in its seasons and variations. The liturgy was the chief form in which they encountered the faith they all professed.

In this book, Cosima Clara Gillhammer—an Oxford scholar of Middle English—explores how the medieval liturgy structured and held both the great story of salvation and the particular stories of individual lives, joining the one to the other in words, music, and movement. As she writes in the Introduction: “Funerals give a shape to mourning just as weddings give a shape to joy.” We read—reproduced in full, not merely in quotes—familiar liturgical texts such as the Reproaches and the Exsultet alongside those we now know chiefly from music, such as the Dies Irae and the Stabat Mater, and medieval poems most of us have never encountered before. Arranged by emotions and experiences—hope, suffering, grief, and so on—the book illuminates unsuspected depths of feeling: “The liturgy is not merely a set of abstract and symbolic words and gestures, but an act that engages with the whole breadth of human emotion. It reaches to the bottom of human experiences because it is an expression of the belief that God himself experienced them during his life on earth.”

These comments are reminiscent of a point made by Church historian Christopher Dawson about the liturgy’s legacy. Speaking of Renaissance humanism, he wrote: “Humanism was, it is true, a return to nature, a rediscovery of man and the natural world. But the author of the discovery, the active principle in the change was not the natural man; it was Christian man—the human type that had been produced by ten centuries of spiritual discipline and intensive cultivation of the inner life.”

The liturgy was the principal means of that cultivation, weaving together the feasts of the Church, the cycle of the earthly year, and the great events of individual lives into one seamless tapestry of observances. Gillhammer’s book centers more on feasts than, as theologians tend to do, on the structure of the Mass, for the simple reason that feasts loomed larger in medieval experience. For us, a feast is a Mass with different color vestments, which, if we’re very lucky, won’t simply have been moved to the nearest Sunday. In the Middle Ages, a “holy day” meant, in addition to its religious significance, at least a day and a half off work. There might be a vigil fast, a procession, a series of civic ceremonies. The whole of ordinary life was suspended to mark the feast, and the year contained dozens of such holy days.

The liturgical element of the celebrations was much more varied. The Divine Office played a far larger part in the lives of lay Catholics than it does now, when its communal celebration is confined primarily to religious orders. The best modern analogues are Anglican cathedrals. There, the services of Matins and Choral Evensong are at the heart of the liturgy, a time-hallowed ministry of music and beauty that draws more people annually than any of the churches’ official missions. There, as in the medieval Church, you will hear the canticles (Magnificat, Nunc Dimittis, Te Deum, and so on), the versicles and responses, the O Antiphons, and the Psalms. A devout Catholic could easily live a lifetime without hearing any of these except a rather limited selection of the Psalms, their fullness truncated and their beauty lost in translation. Lost, too, are the Stabat Mater and the Dies Irae, cut from the liturgy by reforming councils. Tenebrae is rare outside religious houses. Only the Good Friday and Easter Vigil services still retain a sense of the splendor that once accompanied feasts we now barely notice: St. Luke the Evangelist, St. John the Baptist, St. Michael and All Angels.

The proper celebration of feasts extended far beyond services, shaping the cathedrals themselves. At Wells, passages were built into the west front to allow choristers to sing from on high, their voices floating down as if angels in heaven were joining the service. At Salisbury, which was never a monastery, a huge and soaring cloister allowed the numerous processions specified by the local Sarum Rite to be celebrated in all weathers. In December, boy bishops and canons would be elected from among the choristers, exercising control of the cathedral until Holy Innocents’ Day, and embarking on visitations afterward (this tradition is one of the few omissions from Gillhammer’s book). At Corpus Christi, in addition to the processions some of us still know, townspeople would dramatize the whole of salvation history in the Mystery Plays.

Gillhammer roams well beyond the strictly liturgical (a concept that would mean rather less to the medieval mind than to the modern). We are introduced to a dialogue between Christ and his mother, and to poems in which an individual soul talks to Christ. We travel to Paradise with Dante and experience the Apocalypse with William Blake. In a particularly memorable chapter, we follow the musical motif of death (the opening plainchant line of the Dies Irae) from a Dark Age monastery to the Great Rift Valley, Tatooine, and Mordor, with contributions along the way from Mozart, Mahler, and Rachmaninoff. We see how potent and fertile a source of meaning the liturgy can be, even in secular settings. We also see how our sense of death has drifted over the centuries, without ever losing the archetypal sense of dread, which Duffy identifies as integral to the human experience.

I came away from the book with a sense, chiefly, of sadness, a feeling best expressed by Matthew Arnold’s “Dover Beach”:

The Sea of Faith
Was once, too, at the full, and round earth’s shore
Lay like the folds of a bright girdle furled.
But now I only hear
Its melancholy, long, withdrawing roar . . .

In six years as a cathedral chorister in the Church of England, first at York Minster, then at Wells, I encountered almost all of the liturgy Gillhammer explores, and more. I sang through the book of Psalms several dozen times. To this day, they are written on the heart, and verses come to mind at a moment’s notice. I knew Evensong and Matins, with their various canticles and collects, by heart. I sang the Advent Responsory in total darkness and the Te Deum in morning sunlight. I felt the urgency of a medieval Doom in the music of a nineteenth-century Requiem: Libera eas de ore leonis! Libera eas de poenis inferni, et de profundo lacu! I saw the Mystery Plays twice, once on wagons as they would have been performed in medieval times, and much later, unforgettably, on a huge banked stage in the nave of York Minster. Aside from the Easter Triduum at the Oxford Oratory, I have never met liturgy in the Catholic Church with the range, variety, or depth of feeling that I grew up with, and that Gillhammer describes.

Feeling is neither an unfortunate side effect of liturgy nor a poor second to correct ecclesiology. A liturgy, or indeed a church, that doesn’t allow room to express the full range of human experience will be abandoned for something that does. Writing of changes to the funeral service after Vatican II, Eamon Duffy remarked:

One of the principal functions of liturgy is to allow us to pray all our thoughts and feelings, to acknowledge before God what we really are, not to suppress and sanitize our innermost selves and only bring to him what is acceptable and theologically correct. The bitter note of protest is surely one of the most basic of human responses to death, and one of the most legitimate . . . echoed in the cry from the cross, “Why have you forsaken me?” and elsewhere. We need to come to the knowledge that “my redeemer liveth” but we need also to be allowed to rage against the dying of the light. The old liturgy made space for both: the new does not.

In just this vacuum, this refusal to allow room for old terrors and dark feelings, a cult such as Santa Muerte can grow strong. It flourishes on emotional ground that the Church neglects.

The Psalms became the backbone of the medieval liturgy precisely because they met that need. There is a Psalm for every fear, every feast, and every feeling. Christ’s despairing cry from the cross is from the opening verse of Psalm 21/22. The poetry of the Psalms, like all truly great poetry, renders an individual experience universal. We are joined to God in moments of blackest despair: “But as for me, I am a worm, and no man; a very scorn of men, and the outcast of the people” (Coverdale, 22.6); of yearning: “Like as the hart desireth the water-brooks: so longeth my soul after thee, O God” (42.1); of wonder at Creation: “Praise the Lord upon earth: ye dragons, and all deeps/Fire and hail, snow and vapours: wind and storm fulfilling his word” (148.7–8); of exaltation: “Praise him upon the well-tuned cymbals: praise him upon the loud cymbals/Let everything that hath breath: praise the Lord” (150.5–6). Coverdale’s metrical version, with its pronounced central pause, uses the Hebrew two-line structure to echo the sonorous half-lines of Anglo-Saxon verse. More fluid translations draw us into other poetic worlds:

Let ringing timbrels so his honour sound
Let sounding cymbals so his glory ring
That in their tunes such melody be found
As fits the pomp of most triumphant king
Conclude: by all that air of life enfold
Let high Jehovah highly be extolled.
(Mary Sidney, Psalm 150)

So, too, the poetry that made its way into the medieval liturgy, such as the Dies Irae and Stabat Mater, connects us to the inner world of Christ and those who knew him. As Gillhammer notes, “The gospels restrict their laconic narrative to events that can be externally observed. They report what happens and what is spoken as Christ is on the cross, but are mostly silent on the subject of feelings. In every age these blank spaces have been filled by believers’ own imaginations.” No tool is so powerful for this as poetry. At secular funerals, almost all readings are from poetry rather than prose. The medieval liturgy makes full use of both biblical and extrabiblical poetry, in spite of previous Church efforts to exclude the latter (see, for instance, Canon 59 of the Council of Laodicea). Indeed, it makes full use of the whole body, all the senses, the imagination, the power of theater, and almost anything else it can (the major exception being dance). The components of the modern liturgy were only part of a much larger, richer whole.

The Middle Ages would have been very much poorer had the Laodicean attitude been allowed to stand. Today, we are much poorer without the Dies Irae, most of the Psalms, the Mystery Plays, the numerous processions that survive in the Orthodox, Eastern Catholic, and sometimes the Anglican Churches, and so on. Medieval Christians loved good preaching and could listen to a Dominican friar for far longer than even a modern evangelical will listen to a pastor. But almost all forms of modern engagement and evangelization are conducted in prose, or else anchored by it. Talks, books, homilies, and reading circles all have their proper place, but not nearly as big or expansive a place as we afford them. I’ve heard thousands of homilies and couldn’t quote a single phrase from any of them (in which, I suspect, I have much in common with your average medieval peasant). The words of poems by Herbert, Yeats, or Eliot come far more easily. The words of the Psalms, the canticles, or the Requiem are with me always.


Image by Richard Mortel, licensed via Creative Commons. Image cropped.

The post Bright Girdle Furled appeared first on First Things.

Hegel-Sized https://firstthings.com/hegel-sized/ Wed, 17 Sep 2025 05:00:00 +0000 https://firstthings.com/?p=103297

The Collapse of Global Liberalism:
And the Emergence of the Post-Liberal World Order

by philip pilkington
polity, 237 pages, $22.95

A sense of an ending is in the air, but there’s little consensus about what’s ending or why. Progressives worry about the end of democracy, while MAGA conservatives celebrate the sudden implosion of libtard dominance. European leaders stand aghast at widening fissures in the transatlantic alliance, while Hungary boldly carves out a new path for the continent. Millennials and Gen Zers see the American Dream riding off into the sunset, while the terminally online either fret over the instability of the postwar consensus or gleefully push it further toward the edge.

Economist Philip Pilkington has a name for all this: We are witnessing the “collapse of global liberalism.” Liberalism isn’t dying; it is well and truly dead. Any signs of life are the twitches of a cadaver. I’ll come back to “liberalism,” which does some heavy lifting here and for many other contemporary writers (mea culpa). The great virtue of Pilkington’s book is his attention to the specific ways the world has changed, very recently. There is, perhaps, little new here, but Pilkington moves with easy confidence across levels of analysis, from geopolitics to macroeconomics to birthrates to religion, producing a succinct and largely convincing portrait of what’s happened and what’s ending.


The first thing that’s happened is China. Admitted to the WTO in 2001, it has now surpassed the U.S. and the E.U. in its share of international trade and has become the world’s largest economy. It has also parried Western efforts to curb its growth by cultivating deep economic relationships with and investing billions in Asia, Africa, and South America. Domestically, China has adopted and adapted Western economic patterns without becoming Western. Culturally rooted in ancient Confucian conservatism, with a centrally directed economy and an undemocratic polity, it succeeds despite defying the liberal paradigm. As Pilkington writes, Western forecasts of a Chinese crack-up are wishful. China is the real deal: “likely the most stable major economy in the world today,” and a frontal challenge to the way Western policymakers think the world works. China has taught the Global South that economic success doesn’t depend on conforming to Chicago School diktats about austerity, privatization, and free trade. Washington rules no longer rule, as other centers of global power have taken shape. In short, China is the main catalyst for the deterioration of American unipolarity and the emergence of multipolarity, a major theme of Pilkington’s work.

Russia’s invasion of Ukraine also happened. After the collapse of the Soviet Union, Russia remained stubbornly Russian, never really tempted to transform itself into a Slavic extension of the West. And since 2022, Russia has survived Western sanctions and ostracism. Here again, China looms large; China is neutral in the Ukraine conflict and has capitalized on Western sanctions by replacing the West as Russia’s main economic partner. Meanwhile, European industry suffers from the loss of Russian energy, and American trustworthiness took a hit when we froze Russian assets. Russia refuses to play by the rules of the global game and holds its own. Like China, it will have imitators, a prospect that justifies Pilkington’s judgment that the “Western response to the Russian invasion of Ukraine was one of the largest geostrategic and economic blunders . . . in modern history.”

Changes in military technology are also reshaping the global distribution of power. The Houthis have fought off all comers because they have access to cheap drones capable of damaging or destroying multimillion-dollar airplanes and ships. Trench warfare disappeared during World War II because of tanks and air surveillance, but now tanks can be taken out by drones, and surveillance technologies virtually eliminate the possibility of a surprise blitzkrieg. New wars will look a lot like the old ones, with trenches, fixed lines, and attrition. Victory will go to the nations with the heaviest heavy industry, superior strategy, and largest supply of available soldiers. It’s no longer sensible, if it ever was, to compare military strength by comparing military budgets.

In the West, hyperliberalism happened. With liberal politics largely on “autopilot,” as Pilkington puts it, liberals devoted their energies to pushing liberal ideas into the most intimate private spaces. Falling birthrates, brought on by the liberalizing of sexual practices, portend a host of crises—which the West tries to fend off by importing humans, a form of “biological imperialism.” Hyperliberalism also hardened economic policy, elevating competition into a transcendent good, regardless of the damage it did to manufacturing or wages. Deindustrialization weakened the West, and it’s too late to fix it with tariffs. To compete in the global market, American companies went abroad in search of cheap labor, but they stay on because of the skilled workers and the culture of expertise that’s grown up around their factories. Apple isn’t going to shutter its Chinese operations anytime soon, especially after Trump gave them a loophole large enough for a container ship.

Religion happened—or, more accurately, continued to happen despite liberal prophecies about its demise. Christianity never disappeared, and pre-Christian religions have recently made a comeback. Many Western countries have effectively established an environmental worship of Gaia, ending the centuries-long liberal effort to privatize religious beliefs. The only religions that are withering today are liberal ones, while fresh forms of paganism spring up, turning Western countries green and threatening to wreak havoc on energy-dependent advanced economies.

Much of The Collapse of Global Liberalism presents granular analysis of this sort, but the book is overlaid with a Big Metahistorical Thesis about liberalism: a Hegel-sized thesis, no less, based on the supposition that “discernible abstract ideas” are the invisible drivers of history. Pilkington admits that liberalism comes in soft, hard, and hyper intensities. Still, throughout its history, liberalism has been a system of ideas and a metaphysics committed to a set of core convictions that issue in a political and cultural project: Society must be rationalized, and, specifically, society can become rational only when hierarchies are demolished and human relationships equalized, organized by consent and contract. Take Adam Smith’s commercial society—everyone trucking and bartering for advantage—apply it to family, church, social relations, sexuality, and you’ve got liberal order. Pilkington’s definition is elastic enough to include Marxism as a form of liberalism, since Marxists seek to demolish the economic hierarchies that liberal economies generate.

Liberal metaphysics, for Pilkington, also entails an eschatology. Since long before Francis Fukuyama, liberals have awaited the final triumph of liberalism. Impatient liberals kick in some doors to ensure its elusive victory. At its worst, liberalism becomes a vicious circle, or a selfish meme, which exists only for its own self-perpetuation. This line of argument resembles the thesis of Patrick Deneen’s Why Liberalism Failed, the Bible of American postliberalism. In Deneen’s telling, and partly in Pilkington’s, the contemporary West blossomed from liberal ideology like a flower from a seed.

Liberal ideas are, Pilkington claims, directly to blame for many contemporary social crises. Religiously devoted to progress, liberals are more enthusiastic about sheer novelty than about real-world outcomes. Every new technology is better than the last new technology. The DOD is dazzled by shiny new weapons, whether or not they confer any actual military advantage. Pilkington asks the unaskable: Is email really all that much better than a fax? Collapsing birthrates can also be traced to liberal convictions. As Deneen has emphasized, liberalism rests on an anthropology, and kids violate liberal anthropology in every way. Children enter the world without their consent, can’t approve a contract, can’t even stand on their own two feet. Neither can the elderly, which makes euthanasia an attractive option for liberal societies whose younger workers are squeezed by lopsided demographics.

Pilkington also indicts liberalism for the crisis of homelessness, addiction, and mental illness. Liberals free people to do whatever they like: We can’t keep them from choosing to do drugs. Confining the insane violates their autonomy. The proof is in the converse: Traditional societies (the Gulf States, for example) affirm hierarchy and authority and severely punish crime, and they boast clean and livable cities. Liberalism licenses individuals to pursue freedom even at the cost of losing their dignity as human beings.

Pilkington’s metahistorical thesis introduces some wobble into his argument. When he is describing the impact of China’s rise or the Ukraine war, “liberalism” refers to the postwar international order, or, even more specifically, post-Soviet unipolar politics. When he is discussing homelessness and the demographic crisis, “liberalism” is an ideology marching steadily through the entire modern period. Pilkington steadies the wobble by saying that liberalism has only recently reached its practical and conceptual zenith, but the wobble remains.

The definitional wobble is a symptom of a larger problem, the clash between the book’s intellectualized framework and Pilkington’s illuminating and very concrete portrayal of what’s happening around us. Liberal order is the embodiment of an idea, but it’s being dismantled through a series of contingent events. Surely, though, liberal ideas were born in the same manner as they are now dying: that is, in the midst of unpredictable power struggles, wars, and economic upheavals. Pilkington’s Big Idea tilts idealist, but his account of liberalism’s end is thoroughly realist. At times, the idealism swallows up the details, and it seems that he has catalogued all that’s wrong with the world and laid it at the feet of liberalism.

In fairness, Pilkington didn’t write a history of liberalism, but the imbalance affects both his diagnosis and his prescriptions. By intellectualizing, nearly reifying liberalism, he makes liberalism seem more consistently ideological than it is and makes it difficult to sift out the good in it. Indeed, on his account, it’s hard to discern liberalism’s appeal. Human beings are, he says, inherently hierarchical and ritualistic creatures. But if liberalism cuts so directly against the grain of human nature, how did it triumph in the first place? What’s the secret of its staying power? Simplicity, Pilkington says. Simplicity and reductiveness make liberalism dangerous, but “these are the very characteristics that make it so attractive.” But that’s far from adequate to explain a mega-idea that’s gripped the West for half a millennium. On the face of things, it seems much more plausible that liberalism captured something true about the human condition, however distorted it was or has become.

Pilkington’s postliberal prescriptions are charmingly, refreshingly modest. He proposes no grand system of ideas or policies, no twelve-point program. Instead, he calls on common sense, time-tested traditions, and nature. Natural law and teleology will make a comeback in a postliberal world, but Pilkington’s treatment of these topics is so brief that it’s hard to know what he has in mind. By the time we reach the closing pages, we’ve left Hegel well behind. Not that I’m complaining, but if Hegel is right, we can’t issue liberalism’s death certificate until some other Big Idea comes along to replace it. If Hegel is wrong, then maybe we can’t isolate a single key to modern political history after all.

I think we can resolve these tensions without depending on Hegel. Modern political order was a response to the Reformation and post-Reformation wars of religion. To prevent continued religious violence, it was argued, rulers had to scrape away all religious trappings. Liberalism uprooted power from its theological grounding, depoliticized the Church, and attempted to organize earthly society without any reference to heaven. Liberalism in Pilkington’s sense—rationalization, assault on hierarchy—is crucial to the story, but the larger framework is the collapse of medieval Christendom. As I’ve argued elsewhere, liberalism is a post-medieval ecclesiological heresy.

Pilkington is right that liberal order is fraying. But the prescription has to be something more internally coherent than his postliberalism. It should be a postliberalism that acknowledges the successes of the liberal centuries: an unprecedented increase in material wealth; the toppling or transformation of decadent aristocracies; freedom and the franchise; the worldwide dominance of liberal nations like Great Britain and the United States. On all these fronts, liberalism’s record is mixed, and it’s arguable that its strengths depend on institutions and convictions that predate liberalism. But on a purely historical basis, we’re right to conclude that there’s something human and humane, something worthy of preservation, in liberalism. Christians should press for a renewed Christendom, one that has learned these valuable lessons of the modern age—not least the lessons that Pilkington so incisively analyzes here.


Image by Miriam Guterland, licensed via Creative Commons. Image cropped.

The post Hegel-Sized  appeared first on First Things.

B. F. Skinner Is Back https://firstthings.com/b-f-skinner-is-back/ Tue, 16 Sep 2025 05:00:00 +0000 https://firstthings.com/?p=102997

The post B. F. Skinner Is Back appeared first on First Things.

In the summer of 1942, Arthur D. Hyde, vice president in charge of research at General Mills, held what must have been the strangest meeting in his long career as an applied scientist. He had been approached by a young man with connections to the U.S. military seeking a research facility for a team of psychologists. They were developing a novel missile-guidance technique that showed promise in early trials. With fighting raging in Europe, and preparations for the campaign in the Pacific almost complete, this was an opportunity for General Mills to contribute directly to the war effort. 

The proposed guidance system, Hyde learned, was not a new piece of machinery but a biological organism trained to act as pilot and bombardier. Burrhus Frederic Skinner, the lead scientist, had demonstrated that pigeons could learn to pick out the visual patterns associated with enemy tanks and planes with remarkable precision. His team had already created a special apparatus to house a pigeon in the nose of a prototype missile. All that was needed to create a potentially war-ending weapon was to find a way to convey the signals from the pigeon to a mechanism that would direct the missile to its target. Concern for the fate of the bird was a peacetime luxury that could not be afforded in an era of total war.

Hyde gave Skinner a laboratory on the upper level of the Gold Medal Flour milling factory in downtown Minneapolis, situated below a sign spelling out the word “Eventually,” an abbreviation of the marketing slogan “Eventually you will use Gold Medal Flour, so why not now?” Had he believed in such things, Skinner would surely have seen in those massive letters a portent or a divine joke. He was then making a name for himself as the leading American proponent of behaviorism, a movement in psychology that emphasized the need for direct empirical analysis of behavior along with skepticism of the explanatory usefulness of mental concepts. For Skinner, notions like “mind,” “volition,” and “intention” were so many psychic fictions that needed to be expunged before psychology could establish itself as a real science. This made him something of an insurgent at a time when the field was still largely understood as the study of human consciousness. He attracted a small group of zealous adherents but a far larger body of equally determined critics. Despite this, he retained an unshakeable faith that reality was on his side and so his position would be vindicated, eventually.


As Skinner found it on leaving Harvard in the mid-1930s, behaviorism was struggling to emerge as a genuine research program. It had been given its name and impetus by the pugnacious John B. Watson, who in 1913 published his manifesto “Psychology as the Behaviorist Views It,” declaring that psychology needed “introspection as little as do the sciences of chemistry and physics.” Though he admired Watson’s unflinching materialism, Skinner felt that Watson had issued a promissory note without the results to back it up. The first true demonstration of the possibility of a pure science of behavior, by Skinner’s reckoning, would come a decade later with Ivan Pavlov’s investigations of animal reflexes.

The classic Pavlovian experiments are still widely known today. After gruesome surgical preparation, a dog is restrained and presented with food, and its salivary flow is measured. A “neutral” stimulus such as a metronome beat is then applied each time the food is revealed. Following a number of trials the metronome alone will elicit the same salivary response as the food—evidence of what Pavlov called a “conditioned reflex.”

There is a common misconception that Pavlovian conditioning is, in the memorable words of the psychologist Frank Irwin, all just a matter of “spit and twitches.” Despite his training as a physiologist, Pavlov had no special interest in the functions of the canine salivary system. His real intention was to create a general theory of learning that explained the way behavior is modified by experience in both animals and humans. On the basis of his experiments, he conjectured that learning was reducible to the process observed in his dogs, whereby associations between different stimulus events are formed and reinforced over time. Between the sounding of the metronome and the flow of saliva, there was no need to interpose any “magical explanatory concepts” or “hypothetical mental entities,” to use Skinner’s language.

“Pavlov is a star which lights the world,” wrote H. G. Wells in the New York Times in 1927, in an article read with eager approval by a young Skinner. That light had revealed two surprising facts: first, that learning could be explained in mechanistic terms; second, that the appropriate language with which to describe the mechanism was that of stimulus, response, and reinforcement. Skinner’s own major contribution to the behaviorist movement was to extend this language to include a new form of learning he called “operant conditioning.” The basic operant experiment is described in his first book, The Behavior of Organisms (1938). A pigeon is placed in a box rigged with a lighted button that activates a food dispenser. The box environment, which came to be known as a “Skinner box,” enabled the efficient repetition of experiments with easy control of the relevant variables. Unlike Pavlov’s dog, Skinner’s pigeon is left to learn from the consequences of its behavior as it acts on the environment. Through trial and error it discovers the food-dispensing button, and with each peck a food pellet is released with dependable regularity, acting as a “reinforcer.” Thus, over time, the bird settles into a stable pattern of action, guided by the simple principle that reward is contingent on behavior.

It was during the long summer months in the Gold Medal Flour complex that Skinner had what he later called a day of “great illumination.” Hooking down a pigeon from the laboratory rafters, he attempted to teach it to bowl a small wooden ball. In Skinner’s jargon, bowling is a behavior with a complex “topography” involving a chain of actions performed in a specific sequence. The more complex the behavior, the harder it is to design a reward signal to reinforce it. To overcome this problem, Skinner invented a method called “shaping,” in which the behaviors are built up by a process of successive approximations. Skinner’s team started by reinforcing the pigeon every time it moved near the ball. A further reward was added when it made contact. They continually adjusted the schedule of reinforcement, and in a matter of minutes the ball was “caroming off the walls of the box as if the pigeon had been a champion squash player.”

The team’s members were left staring “at one another in wild surmise.” These remarks have caused some puzzlement among academic psychologists, as Skinner had already performed successful shaping experiments before, most notably with his lab rat Pliny, who was profiled in Life Magazine in 1937. The “illumination” of that day was due to the rapidity with which the bird’s undifferentiated behavior had been molded into complex form with a minimum of stimuli. All that was required to work this miracle was reinforcement learning rigorously applied.

Project Pigeon, as it was known among insiders, was mothballed before the end of the war, leaving Hyde with “a loftful of curiously useless equipment” and a few dozen maladjusted birds. Reflecting on his unsuccessful stint in weapons design, Skinner conceded that the pigeon-guided missile was a “crackpot idea,” but he added that the virtue of such ideas is that they “breed rapidly and their progeny show extraordinary mutations.” With his wartime epiphany still working on his brain, he turned his attention away from pigeons and rats, to the fundamental problem of all psychology: the human mind.

Skinner had previously warned his readers in the last chapter of The Behavior of Organisms that extrapolation from the results of his animal-conditioning experiments to humans was unjustified. But he did allow himself to speculate that the only difference between pigeons and men would lie “in the field of verbal behavior.” In one respect this is not so much a conjecture as the statement of a triviality. After all, humans talk and animals do not. But there is more to the language problem than mere “species-specificity,” as Skinner learned from a chance meeting at Harvard with the philosopher A. N. Whitehead, who challenged him to explain the statement “No black scorpion is falling upon this table” in terms of stimulus and response patterns. Skinner was stumped, and his attempt at an answer satisfied neither himself nor his esteemed guest.

With the end of the war and his return to academic life, he began work on a direct descriptive attack on the language question, which culminated in 1957 with the publication of a major work titled Verbal Behavior. It is a sprawling book, but its central argument can be summarized in a few propositions. The first is that language use is a subclass of behavior, more resembling a pigeon batting a ball than a dog salivating in a Pavlovian experiment. It follows that, like all such behavior, it can be explained in terms of a learning process based on stimulus, response, and reinforcement history. These concepts are defined in the standard way: Stimuli refer to the environmental situation in which behavior is observed, responses are elements of behavior that change in an orderly way over the learning process, and reinforcements are the signals critical to effecting that process. Nothing else is needed and nothing else is admissible without reawakening those “ghosts of dead systems” that Skinner was so keen to exorcise.

What is learned over this process is not knowledge of linguistic structure but “response probabilities,” so that when we receive one set of stimuli we dependably reproduce the expected set of verbal behaviors. Language differs from other forms of behavior only in that all reinforcement comes “through the mediation of other persons,” whose approval acts like the food pellet dispensed to the pigeon. The rest of the book is dedicated to categorizing the different classes of stimuli that drive linguistic behavior. A baroque system of classification emerges, full of strange new terms: “mands,” “tacts,” “echoics,” “intraverbals,” “autoclitics.” What becomes of man under Skinner’s analytical lens has been aptly described by Stephen Winokur: “Man himself has been eliminated as a causal variable; he is just a place where causal variables interact to produce talking.”
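Skinner’s scheme can be made concrete with a small illustration. The following Python sketch is my own toy construction, not Skinner’s formalism; the response set, the update rule, and the numbers are all assumptions. A speaker emits utterances according to “response probabilities,” and a listener’s approval, acting like the food pellet, strengthens whichever response it follows:

```python
import random

# Toy model of verbal behavior as reinforced response probabilities.
# Every detail here is illustrative, not drawn from Verbal Behavior itself.

class Speaker:
    def __init__(self, responses):
        # Every candidate response starts with equal strength.
        self.strength = {r: 1.0 for r in responses}

    def emit(self, rng):
        # Sample a response with probability proportional to its strength.
        total = sum(self.strength.values())
        pick = rng.uniform(0, total)
        for response, s in self.strength.items():
            pick -= s
            if pick <= 0:
                return response
        return response  # floating-point fallback

    def reinforce(self, response, amount=1.0):
        # Approval "through the mediation of other persons".
        self.strength[response] += amount

rng = random.Random(0)
speaker = Speaker(["wa", "ma", "water", "mama"])
for _ in range(200):
    said = speaker.emit(rng)
    if said == "water":          # the listener approves only this response
        speaker.reinforce(said)

# After conditioning, "water" dominates the response probabilities.
total = sum(speaker.strength.values())
probs = {r: s / total for r, s in speaker.strength.items()}
```

Nothing in the loop refers to meaning or intention; the speaker is, in Winokur’s phrase, just a place where causal variables interact to produce talking.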

Initial reviews were largely positive, with one philosopher of language anticipating that the influence of Skinner’s theories would be “deservedly great.” As it turns out, this prediction was right, but for the wrong reasons. Verbal Behavior did cast a light over the whole territory of psychology—only not the even, stellar light that Wells saw in Pavlov, but the flash and flickering afterglow of a comprehensive demolition.

Two years after the publication of Verbal Behavior, a young professor of linguistics at MIT, Noam Chomsky, filed his own review with the official journal of the Linguistic Society of America. It went on to become one of its most popular articles of all time, presenting what Chomsky intended to be the definitive critique of a “futile tendency in modern speculation about language and mind.” His first line of attack was to invert an argument favored by philosophically sophisticated behaviorists such as Skinner’s friend Willard Quine, who had famously observed that we lack reliable identity criteria for mental states, and stipulated that there can be “no entity without identity.” Quite so, Chomsky argued, and on those grounds stimuli and responses have no existence, either, for they are every bit as vague, evanescent, and impossible to define as mental states.

To this conceptual analysis he added a second line of attack with a more empirical flavor. Chomsky noted that when children learn languages, they acquire the ability both to produce and to recognize meaningful utterances without digesting thousands of examples. It is as if each child had “constructed” a “grammar for himself,” an internal device that allows him to distinguish sentences from non-sentences. The speed with which this grammar is assembled suggests that it is not entirely learned through trial and error, but must be largely innate. This was the first articulation of what would later be called the “poverty of stimulus” problem: We do not merely reproduce the verbal behavior to which we are exposed; very soon in the language-acquisition process, we can freely create a potentially infinite set of sentences.

This argument flows into the final element of Chomsky’s critique: that language use is creative, in the sense of stimulus-independent. This claim has been variously styled by Chomsky the “Cartesian” or “Galilean” challenge, but it might as well have been called the Whiteheadian challenge, as this is precisely the point the canny old philosopher was making with the sentence he put to Skinner when they met at Harvard: “No black scorpion is falling upon this table.” In what way could an absent object act as a stimulus event? And how could such a sentence, about a non-thing not-falling, ever be an output of a stimulus-response mechanism? It was for this reason, as Chomsky would repeatedly argue over his career, that it is impossible to give a mechanistic explanation of language mastery.

Chomsky’s review landed among the behaviorists like a bombshell. Skinner never responded, leaving it to one of his trusted deputies eventually to publish a rejoinder. A measure of its impact is given by computer scientist Joseph Goguen, who, arriving at Harvard as an undergraduate shortly after Chomsky issued his critique, recalled thinking that “the handwriting seemed on the wall for Skinner’s brand of extreme reductionist behaviorism.”

And so it seemed. Some time after the dust had settled on the Verbal Behavior debate, with Chomsky’s own theory of generative grammars now the reigning orthodoxy in linguistics, two mathematicians went for a walk through the streets of Santa Fe one winter afternoon. The younger of the two, Gian Carlo Rota, was a colleague of Chomsky’s at MIT. He had become preoccupied with the problem of artificial intelligence, which was going through one of its periodic resurgences in the universities. His companion, Stanislaw Ulam, a leading figure in the Manhattan Project, was less enthusiastic. He predicted that nothing would come of the latest round of funding and research, as previous AI frenzies had failed to push the question much beyond where Descartes had left it in the sections of the Discourse that deal with automata.

“Your friends in AI,” Ulam opined, “still want to build machines that see by imitating cameras. . . . Such an approach is bound to fail, since it starts out with a logical misunderstanding.” More precisely, it was a misunderstanding of the nature of logic itself, which “formalizes only very few of the processes by which we actually think.” Getting to truly intelligent machines would require an extension of the whole domain of logic that had the potential to undermine its own foundations. The decisive discovery would probably have something to do with the logical basis of analogy, which Ulam elsewhere called an “intuitive faculty” of “transcendental value.” That, at least, was his guess, and Ulam was generally acknowledged to have the most accurate guesses in mathematics.

The two agreed to leave the subject to the nascent coalition forming between their colleagues in the computer science and psychology faculties, but Rota continued to puzzle out the strands of their discussion in his head. He wrote later that he wondered, as he walked home through the melting snow, “whether or when AI will ever crash the barrier of meaning.”

Rota died in 1999, in the depths of the last AI winter, his life cut short by undiagnosed heart disease. Had he been exceptionally long-lived he might have seen the barrier of meaning well and truly crashed in 2022, when the new class of large language models (LLMs) was thrust into the public’s hands for the first time. Quite abruptly the seemingly intractable problem of computer language processing appeared to dissolve into thin air.

Rota had said of Ulam that he resembled an Old Testament prophet with a special channel to God’s intentions, but in this instance his vision failed. Today’s AI revolution did not come by way of a deep insight into the special logic of analogy, but through the resurrection of the older behaviorist paradigm, which Chomsky had seemingly destroyed in the 1950s. To adapt a phrase from the legendary AI researcher Seymour Papert, the new language machines are “behaviorism in computer’s clothing.”

When Papert made this statement in the late 1980s, he was referring to the predecessors of current deep learning models developed by David Rumelhart and the connectionist school. It was calculated to provoke Rumelhart, who had been careful to protect his research from imputations of behaviorism, which was still a term of abuse at the time. In Parallel Distributed Processing (1987), he and his coauthors argued that their approach was “antithetical” to Skinnerian behaviorism, as it was based on how the brain acquires “internal representations,” something Skinner dismissed as nonsense. Taking inspiration from contemporary neuroscience, their models were assembled from large arrangements of interacting neuron-like computational units. But as Papert noted, the biological metaphor, though suggestive, conceals a process that is reducible to terms more or less identical to those of classical behaviorism:

Learning takes place by a process that adjusts the weights (strengths of connections) between the units; when the weights are different, activation patterns produced by a given input will be different, and finally, the output (response) to an input (stimulus) will change. This feature gives machines . . . a biological flavour that appeals strongly to the spirit of our times and yet takes very little away from the behaviourist simplicity: although one has to refer to the neuronlike structure in order to build the machine, one thinks only in terms of stimulus, response, and a feedback signal to operate it.

With typical farsightedness, Papert added that a system that was able to “learn whatever is learnable with no innate disposition to acquire particular behaviours” would in itself constitute a “vindication of behaviourism.” Such a machine would demonstrate that stimulus-conditioning is, as Skinner had predicted more than fifty years earlier, sufficiently rich to explain language use and just about everything else in the field of behavior.
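Papert’s reduction can be seen in a minimal sketch. The single neuron-like unit below is my own toy example (not Rumelhart’s models): to build it one speaks of weights and activations, but to operate it one thinks only in terms of stimulus (input), response (output), and a feedback signal that adjusts the weights.

```python
# A single neuron-like unit trained by a feedback signal.
# The task (learning logical OR) and the numbers are illustrative assumptions.

def respond(weights, bias, stimulus):
    # The response is determined entirely by the current weights.
    activation = sum(w * x for w, x in zip(weights, stimulus)) + bias
    return 1 if activation > 0 else 0

def train(patterns, rate=0.1, epochs=50):
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for stimulus, target in patterns:
            # The feedback signal: how far the response missed the target.
            feedback = target - respond(weights, bias, stimulus)
            # When the weights change, the response to the same stimulus changes.
            weights = [w + rate * feedback * x for w, x in zip(weights, stimulus)]
            bias += rate * feedback
    return weights, bias

# Condition the unit to respond "1" whenever either stimulus feature is present.
patterns = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
weights, bias = train(patterns)
```

Scaled up to billions of weights and trained by backpropagation rather than this simple rule, the vocabulary is unchanged: stimulus in, response out, feedback in between.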

If the older generation of connectionists found comparisons with Skinner distasteful, current researchers are far happier to own their behaviorist inheritance. Indeed, behaviorism appears to have crashed its Chomsky barrier. In their massive textbook on reinforcement learning, Richard Sutton and Andrew Barto, recent recipients of the Turing Award (often called the Nobel Prize of computer science), discuss how Skinner’s shaping techniques provided the fundamental insight for dealing with the problem of sparse reward, in which the complexity of the desired behavior makes it difficult to design a suitable signal to guide the AI. This is exactly the problem that Skinner faced during Project Pigeon, and which he solved by his method of successive approximation.
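The shaping idea can be sketched in a toy reinforcement-learning loop. Everything below is my own construction, not code from Sutton and Barto: the environment (an agent walking a line toward a goal), the learning rate, and the shaping bonus are illustrative assumptions. The sparse reward pays off only at the goal; shaping reinforces each successive approximation to it.

```python
import random

# Q-learning on a 10-step line, with a shaping bonus for each step of progress.
# All parameters are illustrative; the point is the shaped reward term.

def train(shaped, episodes=200, seed=0):
    rng = random.Random(seed)
    goal = 10
    q = {(s, a): 0.0 for s in range(goal + 1) for a in (-1, 1)}
    for _ in range(episodes):
        s = 0
        for _ in range(50):                        # step limit per episode
            if rng.random() < 0.2:                 # occasional exploration
                a = rng.choice((-1, 1))
            else:                                  # otherwise act greedily
                a = max((-1, 1), key=lambda act: q[(s, act)])
            s2 = min(goal, max(0, s + a))
            r = 1.0 if s2 == goal else 0.0         # sparse terminal reward
            if shaped:
                r += 0.1 * (s2 - s)                # reinforce each approximation
            # Standard Q-learning update toward the reinforced estimate.
            q[(s, a)] += 0.5 * (r + 0.9 * max(q[(s2, -1)], q[(s2, 1)]) - q[(s, a)])
            s = s2
            if s == goal:
                break
    return q

q = train(shaped=True)
```

With shaping, the learned policy moves toward the goal from every state; with only the sparse terminal reward, the signal must slowly propagate backward from the goal, which is precisely the difficulty Skinner’s method of successive approximation was designed to overcome.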

One would almost be tempted to call LLMs “Skinner machines,” were it not for the fact that Skinner himself was generally skeptical of the value of computer systems as a research tool. When one of his most gifted graduate students left the behavioral laboratory at Indiana University to work on the theory of computer learning, Skinner naturally assumed that his brain had been damaged by wartime trauma. If not a Skinner machine, then we can follow Geoffrey Hinton, one of the most prominent AI researchers alive today, who has described modern deep-learning models as a virtual system of Skinner boxes.

For his part Chomsky, who is exceptionally long-lived, has conceded that an AI that could do all the things Papert described would partially vindicate the behaviorists, but he contests that this is an appropriate characterization of LLMs. He has doggedly maintained all the elements of his earlier anti-behaviorist critique—though whereas previously they seemed to constitute a knock-down argument, today they have lost much of their potency.

For example, the Cartesian challenge—that language cannot be produced mechanistically—must be substantially revised, and in such a way as to create suspicions of special pleading, given the startling effectiveness of LLMs at generating novel and coherent sentences. As for the inconsistencies Chomsky found in the basic concepts of stimulus, response, and reinforcement, those criticisms can now be dealt with in the way Frank Ramsey once settled Wittgenstein’s concerns about the logical coherence of mathematics: Suppose a contradiction were to be found in the axioms of set theory; do you seriously believe that a bridge would fall down? The language machines, like bridges, are solid facts.

But the “poverty of stimulus” argument has, if anything, been strengthened. All of the models used today require biologically implausible amounts of power and data to be trained. Whatever children do when they learn language, they clearly do not ingest the entirety of the written text stored on the internet. Nor do they consult a vast archive of pre-labelled and carefully curated examples. Evolutionary explanations of how an LLM-style machine could appear in the biological world are likewise non-starters. There is something in the way our intelligence operates that suggests not a vast statistical exercise performed over billions of parameters, but a series of singular and sudden apprehensions. We do not average reality; we grasp it. For those sympathetic to Chomsky, as I am, artificial intelligence is just one more piece of evidence of that wholly mysterious feature of the human intellect: its creativity.

But that is, increasingly, a minority position. Among Skinnerians, language has always been viewed as the last stronghold of “mind.” Artificial intelligence has all but brought that stronghold down, and in doing so has created the conditions for a renaissance in behaviorist thinking. For those who believe in things like mind, ideas, intentions, the good, the true, the beautiful, let this be a warning.

When John Watson launched the behaviorist project, he declared that its final ambition was the “prediction and control of behavior.” Though there is enough variety in behaviorism to frustrate easy generalization, it all finally converges on a picture of the human person as an automaton that can be shaped as readily as a pigeon.

Skinner was one of the few behaviorists who tried to imagine what society would look like if it were designed on operant principles. He did so in two books, the utopian novel Walden Two (1948) and Beyond Freedom and Dignity (1971). Reading them today, one is shocked by their lunatic consistency and terrible emptiness. They also provide surprising insight into our own social arrangements.

For can we honestly deny today that human behavior is to a large degree manipulable? In the West, governments are already experimenting with “nudge units” that design policies on the basis of behavioristic models. In less democratic systems, most famously in China, far grander efforts in social engineering are already underway. But for the modern consumer, the largest flow of conditioning stimuli comes through the large technology companies, which constantly collect data along thousands of dimensions, the better to model and shape the behavior of their users. Stare long enough at a bus full of people tapping away at their phones, and they will quickly come to resemble pigeons in boxes.

Indeed, more and more our society is becoming a sort of box populi, as one of Skinner’s friends memorably described Walden Two, a book that has suffered from being more debated than read. Today it is probably best known for its influence on Anthony Burgess’s A Clockwork Orange and Stanley Kubrick’s film adaptation of that novel. But readers who go to the book expecting to find humans walled up in vast Skinner boxes, half-starved and intermittently electrocuted, will be disappointed. The community at the new Walden pond looks more like a branch of the Amish than like Burgess’s malign conditioners. Skinner tells us plainly that his social program was intended to replace the anarchic conditioning schedules already imposed on humans as consumers and technology users, with a plan better designed to match the needs of the “human organism,” to use his preferred language.

I anticipate that it will become more attractive as the anarchy intensifies. But we must always remind ourselves of the cost, which was worked out by C. S. Lewis in a series of lectures delivered at the University of Durham in 1943, around the time Skinner was exploring the most efficient way to pack pigeons into missiles:

However far they go back, or down, [the conditioners] can find no ground to stand on. . . . It is not that they are bad men. They are not men at all . . . they are artefacts. Man’s final conquest has proved to be the abolition of Man.


Image by Biglicks12, licensed via Creative Commons. Image cropped.

The post B. F. Skinner Is Back appeared first on First Things.
