5 Years of the Journal of Christian Studies
- Todd Hall

The Center for Christian Studies will celebrate its fifth anniversary in 2026, if the Lord wills. We remain astonished at God's providence, especially when we recollect what we started with—the promise of video courses, curricula, and a journal. We did have some books on hand, but little else, apart from a calling to serve the church and an excellent board (including Libby Weed, now at rest from her labors).
Our first offering fully produced by CCS was the first issue of the Journal of Christian Studies, which you can read gratis on our back issues page. The issue was on "The Church and the Pandemic," and like many issues of JCS, it was not only helpful for the particular context in which it was written (the emergence from the COVID crisis) but also transcended a particular moment in time, dealing with themes such as the importance of worship as a community, pastoral care in times of sickness, how churches handle emergency situations, and the degradation of the discipline of church attendance, among others. Since that first issue, JCS has had contributions from excellent scholars and writers such as Carl Trueman, Mary Eberstadt, Gilbert Meilaender, Brad East, Ben Witherington III, Ephraim Radner, and many more.
We're pleased, then, to begin this fifth year of JCS with this volume on Artificial Intelligence and the Church. Like previous issues of the journal, 5/1 speaks to a particular moment in time: the challenges and potential crises for Christian discipleship in the AI era. As in the other issues, though, JCS 5/1 transcends any particular moment insofar as it is a collection of reflections on Christian faithfulness and presents timeless truths about discipleship.
The Journal of Christian Studies is grateful to have Daniel B. Shank, Associate Professor of Psychological Science at the Missouri University of Science and Technology, as a guest co-editor for this issue. Daniel has no doubt saved us from many misunderstandings about AI, and he has also provided direction and theological insight for this issue. Below we include Daniel's Guest Editor's Note.
We are grateful to be going into our fifth year of publication. We are grateful to our many guest authors, to our subscribers and faithful supporters, to our tireless board, to the church, and above all to our Lord for his providence. May CCS faithfully serve the church for years to come.
If you haven't yet subscribed to JCS, be sure to visit this page soon so you don't miss this important issue!
Consider becoming a CCS supporter in our fifth year! Donate here.
Guest Editor's Note
Artificial intelligence is a moving target, both as a term and as a discipline. In graduate school, one definition of AI was “what machines and computers do in the movies,” suggesting a goal always outside of reach, and another was “getting machines to act or think like humans,” suggesting humans as the ultimate benchmark. Likewise, an algorithmic solution was classified as artificial intelligence only until it was implemented, at which point it was just computer programming. Culturally, the term artificial intelligence has morphed and taken on a life of its own. Now everything seems to include AI.
There is some truth to that: AI techniques of various types are used in all sorts of applications. We might not consider autofill word recommendations, GPS directions, and social media feeds to be AI, but each uses some AI techniques. In fact, they have each been trained in part on real data from their respective domains. This training, or machine learning, represents the broad umbrella of modern AI techniques called connectionist AI, in which AIs learn from data. Another branch of AI, less used today, is symbolic AI, in which the rules are all pre-programmed.
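To make that distinction concrete, here is a minimal, hypothetical sketch in Python. It is not drawn from any real system discussed in this issue, and it is far simpler than anything in production; it only illustrates the contrast between rules written by hand (symbolic) and rules estimated from example data (connectionist-style).

```python
from collections import Counter

# Symbolic AI: the rules are written by hand, in advance.
def symbolic_is_spam(message: str) -> bool:
    handwritten_rules = ["free money", "act now", "winner"]
    return any(phrase in message.lower() for phrase in handwritten_rules)

# Connectionist-style AI (greatly simplified): the "rules" are learned
# from labeled example data rather than written by a programmer.
def train_word_counts(examples):
    """Count how often each word appears in spam vs. non-spam examples."""
    spam_words, ham_words = Counter(), Counter()
    for text, is_spam in examples:
        (spam_words if is_spam else ham_words).update(text.lower().split())
    return spam_words, ham_words

def learned_is_spam(message, spam_words, ham_words):
    """Classify by which class the message's words appeared in more often."""
    words = message.lower().split()
    spam_score = sum(spam_words[w] for w in words)
    ham_score = sum(ham_words[w] for w in words)
    return spam_score > ham_score

# A tiny, made-up training set; real systems learn from millions of examples.
examples = [
    ("free money if you act now", True),
    ("winner winner claim your prize", True),
    ("meeting moved to tuesday", False),
    ("see you at church on sunday", False),
]
spam_words, ham_words = train_word_counts(examples)
print(symbolic_is_spam("Act now for free money"))                       # True
print(learned_is_spam("claim your free prize", spam_words, ham_words))  # True
```

Even in this toy form the key feature of connectionist systems is visible: the learned version is only as good, and only as broad, as the examples it was trained on.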
We most clearly see a program labeled as AI when it uses machine learning-based generative or language models. For example, a web search produces results based on complex algorithms that use machine learning, but many search engines report their results without label or explanation. Newer search modes, by contrast, include “AI Overviews,” which present a natural-language summary amalgamated from content pulled from websites.
Therefore, we see the term AI applied broadly to large language models, or LLMs, which use machine learning through deep-learning artificial neural networks, among other techniques. These models are trained on massive amounts of data and can learn relationships and context among those data. For language, this includes not only grammar and definitions but also words in the context of a sentence, paragraph, document, or conversation.
Once a model's learning is sophisticated enough, it can be applied in two ways: first, to model the meaning of another's language in a conversation or a document, and second, to write a response, summary, or other document based on this model of meaning. These two complementary uses make LLMs programs that can both respond sensibly and create unique content. The generated content is unique in that the AI puts it together in a not-previously-seen way, but it is still confined to building on concepts included in its training or prompted by its interaction. In this way, its new ideas are weighted toward the human data on which it was trained. AI writing tends to be clear because it is an average of different kinds of human writing, but for the same reason it tends not to have the flair, quirks, or ingenuity one might expect of a particular writer.
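As an illustration of what "learning relationships among words and then generating from them" means, here is a deliberately toy sketch in Python. It is not a neural network and not how any production LLM is built; it simply records which word tends to follow which in a short text and then samples new text from those statistics, the same learn-then-generate pattern at a vastly smaller scale.

```python
import random
from collections import defaultdict

# A toy "language model": it learns which word tends to follow which from a
# small text, then generates new text from those learned statistics. Real
# LLMs use deep neural networks and vastly more data, but the basic pattern
# of learning from text and then generating text is the same.
corpus = "the word became flesh and dwelt among us and we have seen his glory"

# "Training": for each word, record the words that followed it and how often.
words = corpus.split()
followers = defaultdict(list)
for current_word, next_word in zip(words, words[1:]):
    followers[current_word].append(next_word)

def generate(start: str, length: int = 10) -> str:
    """Generate text by repeatedly sampling a learned next word."""
    out = [start]
    for _ in range(length):
        options = followers.get(out[-1])
        if not options:       # no learned continuation for this word; stop
            break
        out.append(random.choice(options))
    return " ".join(out)

print(generate("the"))  # e.g. "the word became flesh and dwelt among us and we have"
```

Even at this scale the limitation described above is visible: the toy model can only recombine words it has already seen, just as an LLM's "new" ideas remain weighted toward the human data on which it was trained.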
While LLMs have language in their name, the same techniques can be applied to other sophisticated kinds of data, such as programming languages, images and art, and speech. The term generative AI refers to the use of these models to create specific content in these different media. Because such systems have become popular for their wide range of uses, especially since the release of ChatGPT (GPT-3.5) in late 2022, they are usually what people have in mind when they refer to AI. Their general nature also puts them closer to artificial general intelligence (AGI), as opposed to narrow AI systems, which perform only a single kind of task.
Of the different forms of AI, LLMs and generative AI probably have the largest potential impact on the church. Because they are widely used for creating content, conversing, and summarizing, they have the potential to discuss or summarize what Scripture, commentators, or preachers have written or said. They can dig into specific issues or questions, and even generate ideas or point to other resources related to theology and its presentation in teaching and preaching. Similarly, they could help find or create resources to augment ministry, from generating worship songs to producing children's Vacation Bible School curriculum. Likewise, they could be used to draft personal responses to someone's illness, a missionary's report, or biblical advice on a specific life situation.
To a lesser extent, AI might be used in more peripheral or secondary roles for a ministry. For example, as an organization, the church has logistical and business needs, from scheduling, hiring, and payment to formatting documents and ordering supplies. While most churches probably aren't using data analytics, some may use AI-enabled systems, designed for churches specifically or for organizations more generally, to streamline these tasks.
AI is clearly a force in our society and world and is changing people’s relationships to knowledge, work, interaction, and production. It is important to examine how it can influence the church now and, as it develops, how it might in the years to come. This issue of the Journal of Christian Studies helps lay that foundation and establish those critical conversations.
Daniel B. Shank
Guest Editor






