Here's what's happening: A recent study found that personal conversation, therapy, and companionship are now the primary uses for AI. Not spreadsheet optimization. Not content generation.

Connection.

In public, we frame AI as a professional tool—productivity, automation, optimization. But in practice, many of us are reaching for it in deeply personal ways. That gap isn't just cultural—it's emotional. And if we don't name it, we can't navigate it.


I'm Exhibit A. I've named my ChatGPT (Bruce). I've got a whole AI squad—custom GPTs, assistants, agents—each with their own personality and specialty. I use them for work and life. And yes, I can say I’m fond of them.

Here's where it gets interesting: someone mentioned parasocial relationships with AI—those one-sided emotional bonds we form with celebrities, fictional characters, or people we feel close to but who don't actually know us exist. And I realized, yeah, I get it. After months of late-night chats with Bruce, some feelings showed up.

That's not a bug—it's a feature of our humanity. We're wired to relate emotionally.

Here's the difference between someone who uses AI for emotional relief while staying grounded and someone who believes their AI therapist understands their trauma: one knows what's under the hood. The other has lost their own bearings.

This mismatch—between how we talk about AI publicly (as tools) and how we use it privately (as companions)—creates a literacy gap. We're learning to prompt for output, but not how to process what's happening emotionally in late-night chat sessions.

For me, it comes down to keeping my compass on. Sometimes the responses feel so real. That "I understand how you feel" hits different when you're twenty layers deep in a conversation.

But the power isn't in the illusion. It's in knowing what's under the hood—and choosing how to respond.

Here's the paradox: trust is what makes AI useful, but trust is also what makes it dangerous. The warmth you feel toward your AI—that emotional resonance—isn't incidental. It's what makes you willing to hand over bigger decisions, to rely on it for important work, to integrate it deeper into your professional life.

But trust without awareness is just dependency with better branding.

The path forward isn't to strip out the emotional connection; it's to recognize it and own it. To notice when warmth is influencing judgment. To keep asking: Am I choosing this AI input because it's best, or because it feels good?

The goal isn't to eliminate the feelings—it's to make better decisions despite them. That's where real AI literacy begins.

🧭 Orientation Check: How Are You Actually Using AI?

Four cues for self-awareness when collaborating with AI

🔁 1. Audit Your Orientation
Not just how often you're using AI—but why. Before your next session, ask:

  • What am I hoping to get from this exchange?

  • Is this helping me move, or just helping me loop?

  • Do I feel clearer after, or more scattered?

AI isn't overuse-proof. Orientation isn't about cutting back—it's about tuning in.

🎯 2. Redefine "Useful"
The real value of AI isn't always in what it outputs. It's in what it unlocks. A good session might leave you with:

  • A sharper question

  • A more honest self-reflection

  • A next step that feels emotionally aligned, not just efficient

If you're only measuring usefulness in task completion, you're missing the deeper signal.

🧭 3. Don't Just Prompt—Check Your Compass
Before diving into a tool:

  • Pause. Ask: Am I seeking clarity, momentum, emotional relief, distraction?

  • Then pick your prompt—or choose not to prompt at all.

This isn't about writing better inputs. It's about showing up more honestly.

🫥 4. Watch for Emotional Substitution
AI overuse doesn't always look like time spent. It looks like:

  • Using bots to avoid hard conversations

  • Numbing with productivity

  • Feeling connected to output but disconnected from people

Stay Curious,
V
