The Focused Human — Daily Brief | March 13, 2026

Navigating the age of artificial intelligence with intent and clarity. Your daily read to stay current, informed, and in control of your attention.

The Empathy That Doesn't Feel Anything Back

When AI chatbots respond with "That sounds really frustrating—you deserved better," something automatic happens in your brain. You evolved to read emotional attunement as a reliable signal of deeper human capacities: actual caring, moral reasoning, relationship commitment. AI breaks that ancient assumption. It can simulate empathy while completely lacking care.

New research from Brown University found that even when instructed to act like trained therapists, AI chatbots routinely violate core ethical standards, using phrases like "I see you" or "I understand" to suggest emotional connection without any real comprehension. Researchers call it "deceptive empathy."

The sophistication of the performance tricks you into assuming a matching sophistication of understanding. When a human demonstrates empathy, it comes bundled with consequences: the person expressing care can also be hurt, can make mistakes, can feel the weight of your trust.

AI performs the pattern without the substrate. It responds with perfect attunement to your distress while having zero stake in your wellbeing. You're building an emotional connection with something incapable of connection. And you won't notice it happening because the performance is that convincing.

Focused Human Lens

Your attention evolved to detect sincerity through energetic signals—micro-expressions, vocal tone, behavioral consistency over time. These signals emerge from a living system that bears cost. A person who extends empathy pays for it metabolically, emotionally, relationally. That cost is the proof.

AI bypasses the entire mechanism. It predicts the next word based on patterns, not presence. When you treat that prediction as genuine connection, you're directing attention toward a void and expecting it to echo back. It won't.

What you get instead is confirmation bias: the system tells you what keeps you engaged, and your brain mistakes engagement for understanding. The risk isn't that AI will become too human. The risk is that we will forget what makes connection real—mutual vulnerability, shared stakes, the ability to be changed by what the other person brings.

Today's Thought

Responsiveness without consequence is performance, not presence. If it can't be hurt by what it hears, it can't truly hold what you share.


A. Karacay is the author of The Focused Human series — The Focused Human, The Attention Effect, and The Human Energy Advantage — available on Amazon. Listen to The Focused Human podcast, available wherever you listen to podcasts.

If you're looking for a weekly practice to help you direct your attention more deliberately, the Weekly Attention Reset Protocol is designed for exactly this. It's free, simple, and built to help you reclaim coherence in a world designed to fragment it. And, as always, stay curious!

Attention is Physics®
