
Building Technopoly from the Top Down (Part 2)

Keith D. Stanglin



In my previous post, I identified three truths about technology that are conveyed in many ancient stories.  First, technology always comes with negative, often unintended consequences; every technology does or undoes something.  Second, the people closest to the technology—its creators and purveyors—have the hardest time seeing or foreseeing those negative effects.  Third, they and the technophiles in charge have an incentive to ignore or suppress those negative results, for they have the most to gain or lose.  Technopoly—the thoroughgoing inundation of technology and its ubiquitous authority as a driver of culture—is mostly a top-down enterprise.


In addition to the ancient stories, other recent illustrations of these truths abound.  In 2009, Facebook launched the “like” button.  As told in the documentary The Social Dilemma (2020), the creators’ intention was to provide encouragement to users, whose posts and pictures would be approved by others.  Over the next decade, as the like button became ubiquitous on Facebook and spread to thousands of other applications, it became clear that the like button often had the opposite effect.  Users became increasingly addicted to gaining or maintaining “likes.”  And when an insecure person seeking attention and approval does not get that positive attention, it can lead to discouragement and depression.  The like button, it turns out, has been detrimental to the mental health and stability of the most vulnerable social media users.


On the one hand, this outcome was surprising, at least to the original creators.  Their intent really seems to have been innocent.  On the other hand, the outcome should not have been all that shocking to anyone who has an objective perspective or is trained to consider technology’s negative consequences.  The world of social media is, to a large extent, focused on users getting approval from others.  They get a high from the approval, and, it stands to reason, they get a low from the lack of approval.  When that approval is objectified and quantified, the results are predictable.  Facebook gained the wealth and power while millions of users became more isolated, fragile, and depressed.


These same truths taught by Socrates and Plato have been illustrated as well throughout popular media, including in TV and film.  Rather than pointing readers to 2001: A Space Odyssey (1968), Terminator (1984), Jurassic Park (novel, 1990; film, 1993), numerous Black Mirror episodes, AfrAId (2024), or a hundred others, consider the stop-motion animated film, Wallace and Gromit: Vengeance Most Fowl (2024).  Wallace, the famous inventor of household gadgets and Rube Goldberg machines meant to make life easier, has gone overboard with his inventions.  Wallace doesn’t acknowledge any problem.  His dog, Gromit, however, has noticed.  For instance, while sitting at the dining room table, Wallace says he wants to pet Gromit; Gromit looks happy, but before the dog can leave his chair to come over, Wallace pushes a button, and a petting machine with a mechanical hand rolls over and begins patting Gromit on the head.  The dog endures the artificial affection and is not pleased.  Later, Wallace invents a robot equipped with Artificial Intelligence (AI) to help with household chores.  Gromit, who loves his gardening chores, is now made obsolete by the robot’s efficient and mechanistically precise gardening work.  And, again, Wallace, the happy inventor, is the last one to understand the negative consequences of his inventions, whereas Gromit, who didn’t ask for any of these conveniences, is the loser.


The technocrats push us to greater dependence in subtle ways that we can see all around us.  A few months ago, I replaced my broken smart TV with a new smart TV.  After setting it up and sitting down to use the remote control, I noticed that there are no number buttons on the remote—just a directional pad.  I cannot press 3 and 6 to find channel 36.  I must press “Guide,” scroll down to 36, and press “OK.”  And, inexplicably, the guide almost always brings up the wrong channel, so the process must be done twice.  What used to take one or two seconds now takes 10 or 15.  Before purchasing the TV, I checked the dimensions and ports, but it never occurred to me that I should research whether the remote has numbers.


What is going on here, making a remote control less functional and user-friendly?  It is top-down enforcement, but of what?  Then two insights struck me.  First, the corporation that produced this TV presumes that viewers have only streaming services and apps, which typically don’t have channel numbers; the manufacturer does not intend to support consumers who have cable, satellite, or (like me) the free service of a digital antenna with bunny-ears.  The design promises to make those services even more obsolete.  Second, although it has no numbers, I noticed that the remote does have a button for voice activation.  Every time the TV turns on, the message at the bottom of the screen reads, “Try saying, Alexa, show me college football highlights,” “Alexa, search for Taylor Swift,” or the like.  (Alexa does not know me very well…yet.)  It was not enough to make this an option alongside numbers.  They have intentionally made it very inconvenient to navigate without the voice activation.  The manufacturer is pushing consumers to talk to the TV, which means enabling it to listen to the residents at all times—just another recon surveillance op for big tech.  Of course, these same companies have been acclimating people to talking to machines for quite some time.  Those who have been talking to Alexa and Siri for years have no trouble using voice commands for their TV and probably don’t find it strange.  Of the two terrible options, I have chosen the less convenient.


On April 16, 2025, a Quinnipiac Poll reported that, in their “day-to-day life,” 44% of Americans “think AI will do more harm than good, while 38% think AI will do more good than harm.”  Alas, 18% didn’t offer an opinion.  But this is not exactly a glowing recommendation from the masses.  In the same month, Pew Research released its results on how Americans view AI—the general public vis-à-vis “AI experts.”  It found that only 17% of the general population of American adults think that the impact of AI on the U.S. over the next 20 years will be “positive.”  But when so-called “AI experts” were asked, 56% said the impact of AI will be positive.  These are signs of a top-down revolution driven by the experts and corporations.  And as of September 2025, again according to Pew Research, the majority of Americans believe AI should play a role in areas like forecasting the weather (74%) or searching for financial crimes (70%), but the vast majority seems hesitant to bring AI into the church—only 11% think it should have any role in “advising people about their faith in God.”[1]  Yet the experts’ opinions are filtering down.  Pastors and ministers are increasingly using AI not only to help with administrative work but also to communicate with and counsel parishioners and to prepare lesson outlines and generate sermons, something the people evidently do not want.


Experts and those closest to a technology have almost always underestimated its negative effects.  It is no different with the AI experts.  The evaluation of technology should not be left to the innovators, inventors, and corporate executives who stand to profit most.  It is a question for the moral philosopher and, after that, for the people whose lives it will change.


As for the ancient people at Shinar, the Lord was concerned that “nothing they plan to do will be impossible for them” (Gen 11:6).  He was concerned not because his position was threatened by tower-builders who would storm his heavenly palace.  He was concerned rather for the people themselves and for their flourishing.  If these fallen people with their fallen desires accomplished their every wish, it could turn out to be harmful and self-destructive.  So, just as the Lord God had blocked access to the tree of life for sinful humanity’s own good (Gen 3:22-24), the Lord now confused their language and thwarted the construction project, a punishment for their own good.  Once the people were scattered into various languages and cultures, further evil was restrained.  Perhaps it is time, from the top down, for some divinely inspired confusion.




 
 
 




© 2025 by Center for Christian Studies. Contact Us: info@christian-studies.org. 12407 N. Mopac Expy. Ste. 250-530, Austin, TX 78758

