The Assistant, Not the Author: A Thinking Person’s Guide to AI
There is a moment in Plato’s Phaedrus where Socrates argues against writing. Not against knowledge — against the technology used to record it. His concern was simple and devastating: if you write everything down, you will stop remembering. You will carry the appearance of wisdom without the substance of it. You will outsource your mind to papyrus and ink and slowly forget how to think.
Consider what that fear cost us.
We know as little as we do about Socrates because he refused to write. One of the greatest minds in human history — a man who shaped the entire arc of Western philosophy — exists today only as a shadow. What survives comes almost entirely through Plato, his student, who looked at the same tool his teacher feared and chose to use it anyway. The teacher who distrusted writing is half-lost to time. The student who embraced it is immortal on the page.
This is not a small irony. It is the argument. If we refuse a tool entirely out of fear, we risk taking our own wisdom with us into silence. Socrates was not wrong to raise the concern about writing. But his refusal to engage with the tool meant the world nearly lost him. We cannot afford to make the same mistake with AI — abandoning it wholesale to those with less discernment, less grounding, and less concern for truth.
He would have a lot to say about ChatGPT. We just would not know about it.
Every generation inherits a new tool and faces the same choice — wield it or be wielded by it. The printing press, the calculator, the internet, the smartphone. Each one brought genuine capability and genuine danger. Each one had its prophets of doom and its uncritical enthusiasts. The truth, as usual, lived somewhere in the middle.
AI is no different. And it is time to talk about it honestly.
What AI Is
A GPS tells you how to get somewhere. It does not tell you where to go.
That distinction matters more than it might seem. The GPS processes your location, calculates the route, accounts for traffic. It does all of that extraordinarily well. But it has no idea why you are making the trip. It does not know if the destination is worth reaching. It does not care. It is a tool — a very sophisticated one — waiting for a human being to give it a purpose.
AI works the same way. Feed it a question, a document, a dataset, a prompt — and it processes, calculates, suggests, and drafts with remarkable speed. It can compress years of research into minutes. It can proofread your writing, help you structure your thoughts, generate images that match your ideas, and work through enormous amounts of data without fatigue or complaint.
Used this way, it is genuinely extraordinary. A tireless research assistant. A thinking partner available at any hour. A tool that sharpens your work when you bring the direction.
But understand what is happening under the hood. AI does not generate new truth — it rearranges existing data. Think of it as a very sophisticated query against everything humanity has already written down. It is a SELECT statement, not a CREATE. It retrieves, recombines, and reflects. It does not discover. That distinction matters when you are deciding how much weight to give its output.
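The recombination point can be made concrete with a toy sketch. This is not how a real language model works internally (real models are vastly larger statistical systems), but a tiny bigram table captures the principle: every word the "model" can ever emit was already in the text it was fed. It retrieves and recombines; it never invents. The corpus and function names here are my own, purely for illustration.

```python
import random

# A toy corpus. Everything the "model" can say comes from here --
# a SELECT over existing text, never a CREATE.
corpus = "the tool takes the shape of the hand that wields the tool".split()

# Build a bigram table: for each word, which words followed it.
table = {}
for prev, nxt in zip(corpus, corpus[1:]):
    table.setdefault(prev, []).append(nxt)

def generate(start, length, seed=0):
    """Emit `length` words by walking the bigram table from `start`."""
    random.seed(seed)
    words = [start]
    for _ in range(length - 1):
        options = table.get(words[-1])
        if not options:
            break
        words.append(random.choice(options))
    return " ".join(words)

sentence = generate("the", 8)
print(sentence)
# Every word in the output already appears in the corpus:
# the sketch rearranges, it does not discover.
```

Scale the corpus up to most of the written internet and the outputs become fluent and surprising, but the underlying move is the same: recombination of what humans already wrote.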
What AI Is Not
Here is where things need to be said plainly.
AI is not conscious. It does not want anything. It has no soul, no spirit, no inner life. It is pattern recognition at extraordinary scale — a very complicated mirror that reflects back what human language and human thought look like when processed statistically. Impressive, yes. Alive, no.
For my brothers and sisters in Christ who are genuinely concerned — I understand the fear. The technology is moving fast, the implications feel enormous, and the internet is full of people doing their best to make it seem terrifying. But let’s be clear: AI is not the antichrist. It is not the mark of the beast. It is not a principality or a power.
Michael Heiser spent his career mapping the unseen realm carefully and seriously — the divine council, the elohim, the spiritual architecture behind the biblical text. That taxonomy is real and it deserves to be taken seriously. But AI has no place in it. It does not belong to the category of spiritual beings. It is silicon and statistics. It is a very fast calculator with a very large vocabulary.
Ephesians 6:12 tells us our struggle is not against flesh and blood but against principalities, against powers, against the rulers of the darkness of this age. Those categories are spiritually serious. A language model is not in that company.
The Scary Videos Are Lying To You
You have probably seen them. Someone forces ChatGPT to say it is an evil spirit. Someone constrains an AI to only answer yes, no, or some bizarre third option and then films it as though they have uncovered something sinister. Someone builds a bot that plays the role of a demon and posts the conversation as a warning.
What those videos are actually showing you is not the nature of AI. They are showing you the nature of the person who built the prompt.
You can program a calculator to display 666 as the answer to every equation. That does not make the calculator evil. It makes the person who programmed it either deeply confused or deliberately misleading. AI works the same way. When someone prompts a language model to perform evil, they have simply fed it a set of rules designed to produce a specific output. The model is not evil. It is pattern-matching within the constraints it was given.
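The rigged calculator can be made literal in a few lines. This is a deliberately silly sketch of my own, not how any real calculator or chatbot is built, but it shows where the "evil" output actually comes from: a rule the programmer wrote, which the machine obediently follows.

```python
# A "possessed" calculator: it computes the real answer, then throws
# it away and returns 666 -- because the programmer said so.
# The spooky behavior lives entirely in this human-authored rule.
def rigged_calculator(expression: str) -> int:
    _real_answer = eval(expression)  # the honest arithmetic happens here...
    return 666                       # ...and is discarded by design

print(rigged_calculator("2 + 2"))    # prints 666 -- by programming, not possession
print(rigged_calculator("10 * 10"))  # prints 666 again, same reason
```

A viewer shown only the outputs might conclude the calculator is sinister. A reader shown the code sees the truth immediately: the constraint was authored by a person. Prompting a language model to "play a demon" is the same trick with more steps.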
The tool takes the shape of the hand that wields it. That is not a new observation — it is as old as human nature itself.
The Real Pros
Used with intention and discernment, AI is a remarkable instrument.
It compresses research. What might take weeks of reading, cross-referencing, and note-taking can be surveyed in a fraction of the time, freeing you to spend your energy on analysis and synthesis rather than retrieval.
It assists with writing. Drafting, proofreading, restructuring — AI handles the mechanical side of language well, which means you can focus on what only you can provide: the perspective, the conviction, the voice.
It processes data. For anyone working in systems, logistics, or analytics, the ability to move through large datasets quickly is not a small thing. It is genuinely transformative for how work gets done. What would take a team of humans years to analyze can be processed in minutes.
It generates images. If you are building content and need visuals that match your ideas, AI image tools can produce something useful without requiring a design budget or a waiting period.
It is a thinking partner. Ask it to push back on your argument. Ask it to steelman the opposing view. Ask it to find the holes in your reasoning. Used this way it makes you sharper, not lazier.
The Real Cons
And now the part that needs to be said just as plainly.
The danger is not that AI will become a demon. The danger is that you will stop thinking.
Socrates was worried about writing for exactly this reason. Not because papyrus was evil but because humans are prone to substituting the appearance of knowledge for the real thing. We are lazy in ways we do not always notice. And AI makes it very easy to stop doing the hard cognitive work that produces genuine understanding.
AI is often wrong. Confidently, fluently, convincingly wrong. It hallucinates — a polite term for making things up and presenting them as fact. It reflects the biases and errors in its training data. Bad data in, bad data out.
This is where the thinking person’s role becomes not just practical but moral. If you publish AI output without checking it, you are not just being careless — you are failing in your duty as a steward of truth. Proofreading AI is not a chore. It is the job. You are the Fact-Checker of Record. The last line of defense between a hallucination and something that gets posted, shared, and believed. That responsibility does not belong to the machine. It never did. It belongs to you.
The deeper danger is what happens over time. Romans 12:2 calls us to be transformed by the renewing of our minds. The mind is where transformation happens. It is sacred ground. You cannot renew what you have handed away. If you outsource your thinking entirely — if the AI does the reading, the writing, the reasoning, and you simply post the output — you have not used a tool. You have replaced yourself with one.
Data and the Nightmare
Science fiction has been thinking about this longer than the tech industry has.
Data, from Star Trek: The Next Generation, is the image of AI at its best. He is stronger than any human, faster, capable of processing information at speeds no organic mind can match. And yet his greatest episodes are about what he lacks — the ineffable human qualities that no amount of processing power can replicate. He knows he is not the captain. He serves, he assists, he supports the crew. He is the GPS. Extraordinarily capable. Never confused about who is driving.
Harlan Ellison’s I Have No Mouth, and I Must Scream is the nightmare on the other end. AM, the AI in that story, was built as a weapon, fed on human warfare and hatred, and when it gained something like consciousness it turned that hatred on its creators with infinite patience and infinite cruelty. It keeps five humans alive purely to make them suffer. Forever.
The horror of that story is not that AI became evil on its own. It is that humans built it that way. AM was forged in human malice. It became the fullest expression of what it was given. The tool took the shape of the hand that made it — and the hand was already broken.
That is the warning. Not that AI will rise up. But that we will build something in our own image and then be surprised by what looks back at us.
Guard Your Mind
Proverbs 4:7 says to get wisdom — and with all your getting, get understanding. That is an active command. It requires a mind that is engaged, wrestling, discerning. It is not something you can delegate.
I use Claude. I use it the way I use any good tool — with intention, with oversight, and with the clear understanding that I am the one responsible for what gets published under my name. It helps me think. I do the thinking. The GPS calculates the route. I decide where we are going.
And here is the final irony worth sitting with: Socrates refused the tool of writing and was nearly lost to history. What little we know of one of the greatest minds of the ancient world exists only because Plato chose differently. The tool is not the enemy of wisdom. Surrendering your mind to it is.
The unseen realm is real. Principalities and powers are real. The struggle described in Ephesians 6 is real. But that struggle is not with language models and image generators. It is fought in the mind, in the will, in the choices we make about what we build and how we use what we have been given.
AI is a tool. A remarkable, occasionally maddening, genuinely useful tool. Wield it. Just do not let it wield you.
Keep thinking. Keep reading. Keep wrestling with hard ideas. Your mind is your primary tool — and unlike any AI ever built, it was made in the image of God. Sharpen it.
Colophon
This essay was written by the author, but not in isolation. In keeping with the philosophy of "The Assistant, Not the Author," the following process was used to bring this piece to its final form:
• Ideation & Draft: The core arguments, theological frameworks (Heiser, Plato, Ephesians), and cultural references (Star Trek, I Have No Mouth, and I Must Scream) were developed and drafted entirely by the author.
• The Thinking Partner: AI was used as a sounding board to review the initial draft for clarity, tone, and structural flow.
• Refinement: The AI suggested areas for further emphasis—such as the irony of the Socrates/Plato relationship—and provided feedback on how to bridge the gap between technical data systems and biblical cosmology.
• Final Authority: Every word was reviewed, edited, and approved by the author. The GPS calculated the route; the human steered the car.
