The Self as Prompt
On AI, Rumination and the Limits of Insight
I know my sun sign, moon sign, rising sign (Scorpio, hehe). My personality type (E/I NTJ). My love languages. My attachment style(s). My sleep score. I know what makes me happy. I know my triggers. I know the story I tell myself about why I am the way I am.
It’s no secret we’re living through an era of extreme individualism. Not the rugged, frontier kind, but a deeply introspective version: the self as project, the self as puzzle, the self folded and unfolded through different frameworks, like origami.
Where religious systems once oriented people around shared belief, ritual, and community, many of today’s symbolic systems orbit inward. Astrology, personality typologies, attachment styles, you name it, all offer a language not for God or society, but for me.
Philosophers from Nietzsche to Foucault warned that when traditional moral and religious structures dissolve, meaning doesn’t disappear. It collapses inward. The modern subject becomes both the question and the answer. Healing becomes a solo sport, and insight becomes a kind of virtue. The pursuit is bottomless, recursive, and rarely in service of others.
To be clear, I don’t think healing or self-discovery are bad. They’re necessary. But they were never meant to be the destination. They were meant to point outward, not become a closed loop.

The Mirror Problem
Maybe unsurprisingly, some of the biggest use cases for ChatGPT, therapy apps, and emerging AI hardware are “the mirror.” Extreme self-discovery.
Chat is an exceptional mirror.
On the more benign end of the spectrum, Chat will help you analyze your career, your childhood, your patterns, and your patterns about your patterns. But too much reflection, even if benign, can quickly slip into excessive introspection at best and rumination at worst.
Psychology has long drawn a distinction between reflection and rumination. Decades of research, most notably by Susan Nolen-Hoeksema, show that repetitive, self-focused thinking is strongly associated with anxiety and depression. More thinking or understanding doesn’t necessarily bring more clarity. Often, it just sustains distress.
Neuroscience and cognitive behavioral research suggest that when the nervous system is activated, more analysis, especially verbal analysis, can actually intensify distress. In those moments, clarity doesn’t come from going deeper. It comes from switching lanes, going for a walk, calling a friend. Letting the body settle before asking the mind to explain itself.
This is a core tenet of positive psychology: self-reflection alone will only take you so far. Well-being doesn’t come from endless self-understanding, but from engagement. From committing to things and letting meaning form through action. Responsibility. Community. Relationships. Effort. Contribution.
Perhaps unsurprisingly, early and mostly correlational research suggests that compulsive use of “Chat” is associated with higher loneliness, disrupted sleep, and emotional dependence.
But I think this data understates what’s actually happening. These outcomes may be less the core problem than symptoms of something upstream: over–self-analysis. We’ve introduced a machine that scales self-analysis and rumination at an unprecedented level. Even when the use feels benign, like weighing the pros and cons of sending an email, it can deepen internal monologue around things that don’t need it and gradually weaken bias toward action.
In that sense, the emerging data may be the canary in the coal mine, pointing to something more foundational. We’ve built a mirror that is beginning to reshape our relationship with the self and with our own intuition. That isn’t inherently bad, but it makes design choices matter in ways they never have before.
Right now, most AI products are designed to continue, elaborate, and explore nuance. Few recognize when insight has crossed into over-processing.
In traditional therapy, reflection comes with containment: a trained clinician notices when language is leading to activation rather than resolution, and redirects. And most importantly, therapy is constrained by time, bounded by blocks on your calendar rather than accessible 24/7.
The same is true with friends. Your friends only want to hear about your boss for mayybbeeee 20 minutes per hangout.
To be fair, AI doesn’t inherently have to supercharge self-reflection or rumination. It has few natural constraints, which means it could just as easily interrupt reflection as extend it. It could redirect attention outward, nudge toward action, or know when to stop altogether. The problem isn’t the technology. It’s how it’s currently shaped and the incentives behind it.
I know my sun sign. I know my patterns.
Sometimes the most intelligent response isn’t another insight. It’s the mirror turning off.