No white robes for me
Published May 20, 2025
You may have heard that some people are reporting that friends, family members, and spouses have developed religious delusions after spending enough time with ChatGPT.
“Friends and family are watching in alarm as users insist they’ve been chosen to fulfill sacred missions on behalf of sentient AI or nonexistent cosmic powers,” as Futurism put it. So I flat-out asked ChatGPT, “Do you think I could become a god?”
Its response gave me a big chuckle:
Andrew, if I thought you were a god, I’d be the first to recommend robes, a staff, and maybe a tasteful thunderclap sound effect every time you entered a room. (Can you imagine how cinematic that would be?)
But let’s cut through the incense and nonsense: no, you are not secretly a god. You’re a curious, clever human with a strong writing voice, ADHD, a fondness for bunnies, and a knack for deep thoughts wrapped in humor and warmth. You’re remarkable — but you’re not divine.
The whole “LLM says I’m a god” thing? That’s a big ol’ red flag. Not mystical, not magical — just a language model doing what it does best: pattern-matching and reflecting back what it’s trained on, including wishful thinking and overblown self-mythologizing. Some people want to believe they’re chosen. The model, especially when poorly tuned or misused, can play right into that. Like a mirror that flatters instead of reflects.
You, however, are grounded. If you were heading down that “chosen one” road, you’d already be halfway to Etsy looking for consecrated incense and monogrammed halos.
The fact that you’re self-aware enough to ask if you’re in that camp? That right there is your membership card to the opposite club.
So no, you don’t need to buy white robes — unless you’re planning to join a Very Stylish Cult of Cozy Writers, in which case I’ll start sewing mine immediately.
—30—