Last Day: Subscribe to JCS today to receive a print copy of 5/1, "AI and the Church"
- Todd Hall

The first issue of volume 5 of the Journal of Christian Studies will be shipping soon! Subscribe within the next 24 hours to ensure you receive your print copy!
The theme of this issue is "AI and the Church," and its authors engage questions such as these: What is artificial intelligence (AI)? What should the average layman know about it? What are its benefits and dangers for society? As for the risks, what can and cannot actually happen? What are the implications of AI for the church and its ministries, and for human intelligence and relationships? Should we invite AI into the church and into our homes?
Below are the table of contents and an excerpt from Daniel Crouch's article, "Artificial Assistance, Spiritual Impediment: AI Chatbots and the Risk to Human Flourishing."
Table of Contents:
Even the Bots Cry Out: A Case Study on the Use of ChatGPT for Spiritual Reflection
Kenan Casey, Matthew Sokoloski, Loren Warf, and Wesley Baker
AI as Self-Erasure
Matthew B. Crawford
Artificial Assistance, Spiritual Impediment: AI Chatbots and the Risk to Human Flourishing
Daniel Crouch
Knowing (Digitally?) as We Are Known: AI, Prayer, and the Goal of Theology
Ethan Laster
Artificial Intelligence as Tools and Mirrors for Ministers
Nicholas A. Lewis and Daniel B. Shank
AI in Congregational Ministry: Human Nature and the Significance of Process and Product
Keith D. Stanglin
Artificial Intelligence in the Church: A Christian Appraisal of Advantages, Disadvantages, Benefits, and Risks
ChatGPT (GPT-5)
Excerpt:
Artificial Assistance, Spiritual Impediment:
AI Chatbots and the Risk to Human Flourishing
Daniel Crouch
Harding University
The central problem raised by Artificial Intelligence is not the quality of work that it produces. Critiques of AI are built on shifting ground when they focus on the achievement gap between human and computer agents, for, in many contexts, AI already performs the same or better than its human counterparts. The problem, rather, is what these tools do to our relation to our own work: how they change our attention, our motivations, our habits, and our desires—in short, how they affect our natural flourishing. What we should fear is the gradual erosion of the intellectual and spiritual practices by which human beings are formed.
This essay takes that concern as its burden. I will argue that AI chatbots such as ChatGPT promise real goods—efficiency, accessibility, even degrees of excellence—but they also risk cutting us off from the practices and formation necessary for the good life. Even as this newest iteration of computer technology can improve our lives in undeniable ways, it may also obfuscate what it means for us to flourish. The task before Christians is therefore neither reactionary rejection nor naïve embrace. It is the harder thing: to discern how these tools reshape the conditions of learning and living, and to order our practices so that technology serves rather than supplants the formation of persons.
AI in the Academic Setting
Before turning to the spiritual consequences of AI in the church and the lives of Christians, let us begin in a context with perhaps milder stakes: the classroom. During my graduate work, I tutored in a university writing center, working mostly with undergraduate students. My years there also coincided with the first wave of ChatGPT, as the academic world transformed into the Wild West. Version 3 had just escaped the lab, faculty policies were contradictory, and students arrived with drafts whose authorship was, to be generous, negotiable.
Of course, all of this only became evident to me as the situation unfolded. The standard format in the writing center was for students to read their essays aloud while we, as tutors, asked them to pause so we could talk—to make a quick correction, clarify how a paragraph fit with their thesis, brainstorm further examples with them, and so on. In listening to them read, however, I started to notice that a striking number of students would stumble over their own writing or, when asked, "Why this claim here?" would look at the page as if it belonged to someone else. Delicate conversations with these students eventually led to the realization that they had asked ChatGPT (or a similar service) to write their papers for them. The felt pressure to keep up with classes and life made AI use seem reasonable, even expected. This much has been affirmed by at least one study, which found that higher academic workload and time pressure significantly predicted greater ChatGPT use; in contrast, students more sensitive to rewards (that is, more concerned with the fruits of the assignment and doing well) were less inclined to use AI—suggesting the tool's appeal tracks stress management more than opportunism.
As ChatGPT’s use became more prevalent, trends emerged in how students and instructors thought about it, or, more precisely, in the questions they were asking: Was this cheating? Are there legitimate uses of large language models short of asking them to do an assignment for you, such as brainstorming? Could enough effort put into a prompt, or into revising the final product, make writing that had originated with AI eventually count as a student's work? What is writing for?