UPDATE: The Empathy Gap: Why Your 'Smarter' AI Is Lowering CSAT
PLUS: How to stress-test your AI's logic and a prompt for proactive journey design
Start every workday smarter. Spot AI opportunities faster. Become the go-to person on your team for what’s next.
Apologies: the previous edition was sent in error.
Today’s DCX AI Today is brought to you by Fin.
AI support that earns customer trust?
See what changes minds and what doesn’t.
🗓️ August 4, 2025 | ⏱️ Read Time: ~5 minutes
👋 Welcome
We're obsessed with making our AI sound human, but we've forgotten to teach it how to listen. The current gold rush is toward conversational fluency, yet customers aren't asking for a better conversation partner. They're asking for a better problem-solver, and the most stubborn problems are steeped in emotion, not just logic.
📡 Signal in the Noise
The emerging data and research show a clear divergence: while AI is crushing operational metrics like speed and containment, it's struggling with—and sometimes damaging—the core human elements of trust, empathy, and confidence. The smartest companies are now looking past the hype of fluency and focusing on the new frontier: emotional resonance.
🧠 Executive Lens
Your AI strategy is about to be audited by regulators, whether you’re ready or not. The conversation is shifting from "what can AI do?" to "what must AI explain?" If your team can't articulate why your AI makes the decisions it does, you're not just facing a CX problem; you're facing a major compliance risk.
📰 Stories That Matter
📉 The AI efficiency trap is lowering customer satisfaction
A new analysis reveals a troubling trend: at companies scaling generative AI for service, first-contact resolution rates are rising while CSAT scores for complex or emotional journeys are falling. Researchers call this the "empathy gap": the AI's focus on speedy, logical resolutions fails to address the customer's underlying emotional state, making them feel processed, not heard. The efficiency gains are masking a deeper erosion of customer trust.
Why This Matters: You might be hitting your operational KPIs while silently destroying customer loyalty.
Try This: Audit one "successfully resolved" AI interaction and ask: was the customer's emotional need met, or just their technical one?
⚖️ European regulators just put AI 'explainability' on notice
The EU's AI regulatory body issued new guidance requiring companies to provide "clear and straightforward explanations" for any AI-driven decisions that significantly impact a customer, such as credit denials or fraud alerts. This moves beyond vague privacy policies and demands that CX leaders be able to show, on a case-by-case basis, the logic their AI used. The era of the "black box" algorithm in customer-facing decisions is officially over.
Why This Matters: "Our algorithm decided" is no longer a valid response to a customer or a regulator.
Try This: Pick one automated customer decision and challenge your team to explain its logic in a single sentence a fifth-grader could understand.
🗣️ Your new AI voice might be eroding customer trust
A deep dive into the latest wave of ultra-realistic AI voice agents reveals a new kind of "auditory uncanny valley." While these voices are technically flawless, their lack of subtle, human imperfections—hesitations, slight pitch changes, filler words—can make them feel unsettling to customers. This subtle disconnect can erode trust and make the interaction feel deceptive, even if the information provided is accurate.
Why This Matters: The pursuit of vocal perfection could be making your brand feel less authentic and trustworthy.
Try This: Instead of optimizing for vocal perfection, experiment with AI voices that are intentionally designed to sound more "normally imperfect."
Source: https://www.theverge.com/2025/8/3/23489176/ai-voice-agents-uncanny-valley-customer-trust
🛠️ The home improvement giant using AI to prevent problems
A case study on Lowe's new supply chain AI shows a pivot from reactive to proactive service. By analyzing thousands of data points, the system predicts potential delivery delays before they happen and automatically alerts customers with a new ETA and a direct link to reschedule. It turns a negative event (a delay) into a positive, empowering interaction, dramatically reducing "Where is my order?" calls.
Why This Matters: You can resolve customer issues far more effectively by solving them before the customer is even aware they exist.
Try This: Identify your single most common service failure and brainstorm how you could use predictive AI to get ahead of it.
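You don't need Lowe's-scale infrastructure to pilot this pattern; a rule-based version is enough to prove the concept. Here's a minimal sketch in Python (the shipment fields, delay threshold, and reschedule link are all hypothetical illustrations, not Lowe's actual system):

```python
from datetime import datetime, timedelta

# Hypothetical shipment record; field names are illustrative only.
shipment = {
    "order_id": "12345",
    "promised_eta": datetime(2025, 8, 6, 17, 0),
    "carrier_eta": datetime(2025, 8, 7, 12, 0),  # latest projection from carrier scans
    "customer_email": "customer@example.com",
}

DELAY_THRESHOLD = timedelta(hours=4)  # ignore slips smaller than this

def check_for_delay(shipment):
    """Return a proactive alert payload if the carrier ETA slips past the promise."""
    slip = shipment["carrier_eta"] - shipment["promised_eta"]
    if slip <= DELAY_THRESHOLD:
        return None  # on time, or close enough: stay silent
    return {
        "to": shipment["customer_email"],
        "new_eta": shipment["carrier_eta"].strftime("%b %d, %I:%M %p"),
        "reschedule_link": f"https://example.com/reschedule/{shipment['order_id']}",
    }

alert = check_for_delay(shipment)
if alert:
    # In production this would feed an email/SMS service, not print.
    print(f"Notify {alert['to']}: new ETA {alert['new_eta']}, reschedule at {alert['reschedule_link']}")
```

The design choice that matters is the silent path: the system only speaks up when the slip crosses a meaningful threshold, so customers aren't trained to ignore the alerts.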
🧑🏫 How to teach AI like you'd train a human apprentice
Researchers are highlighting a technique called "observational scaffolding" to train more competent CX agents. Instead of just feeding the AI raw data, the model first "shadows" top-performing human agents on complex calls, learning conversational flow, empathy cues, and de-escalation tactics. The AI then acts as a co-pilot, suggesting responses before being gradually allowed to handle similar interactions on its own, like a human apprentice.
Why This Matters: Your best human agents—not just your data logs—are your most valuable AI training asset.
Try This: Create a small, elite team of your best agents to serve as "AI mentors" for a pilot program.
Source: https://ai.stanford.edu/blog/observational-scaffolding-for-cx-ai/
✍️ Prompt of the Day
Stress-Test Your AI's Logic
Act as a skeptical customer who has just been denied a request by our AI system. Your goal is to challenge the AI's logic and expose any "black box" reasoning.
The AI's decision was: [Insert a common automated decision, e.g., "Your request for a refund on order #12345 has been denied because it is outside the 30-day return window."]
Your task is to generate 5 probing, follow-up questions that a reasonable but persistent customer would ask to understand the *why* behind the decision. The questions should test for fairness, flexibility, and the data used. Do not be aggressive, but be firm.
Immediate use case: It reveals how well your AI—and the human agents who rely on it—can explain its decisions under pressure, highlighting gaps in explainability.
Tactical benefit: Use the generated questions to build a new set of "explainability FAQs" and training scenarios for your agents.
How to incorporate quickly: Role-play this with an agent who handles escalations from your AI chatbot to see how they currently respond.
🛠️ Try This Prompt
Act as a CX strategist using the case study of Lowe's proactive delivery AI as inspiration.
Our company's most common inbound service call is: [Insert your company's most frequent, predictable service failure, e.g., "confusion about how to use a specific feature in our software," or "questions about a bill's new charge"].
Your task is to design a proactive, AI-driven customer journey that makes that inbound call obsolete.
1. **Trigger Event:** What specific data or user behavior would signal that this problem is *about* to happen?
2. **Proactive Intervention:** What is the automated message or action the AI takes to get ahead of the problem?
3. **The New Experience:** Describe the ideal customer experience from the customer's point of view, where they feel informed and in control, not confused.
Immediate use case: Shifts your team's mindset from reactive problem-solving to proactive journey design.
Tactical benefit: Generates a concrete, actionable plan for a pilot project that could deliver significant ROI by reducing call volume for a key driver.
How to incorporate quickly: Use this as the agenda for a 45-minute workshop with your CX, data, and product teams.
📎 CX Note to Self
An efficient answer that lacks empathy is just a faster way to make a customer feel ignored.
👋 What's Your Assumption?
That's the signal from the noise for today. The real work isn't just implementing AI; it's questioning the assumptions we hold about it.
So here's my question for you: What's one belief about AI and customer experience you've seen challenged this month?
Hit reply and let me know. I read every single response.
Enjoy this newsletter? Please forward it to a friend who likes to think differently.
Talk tomorrow,
—Mark
💡 P.S. Want more prompts? Grab the FREE 32 Power Prompts That Will Change Your CX Strategy – Forever and start transforming your team today. 👉 FREE 32 Power Prompts That Will Change Your CX Strategy – Forever
Special offer for DCX Readers:
The Complete AI Bundle from God of Prompt
Get 10% off your first purchase with discount code DI6W6FCD.