Your Customers Can’t Tell What’s Real Anymore
Deepfakes aren’t just changing media — they’re changing trust. Here’s what that means for every CX leader.
When Deepfakes Meet Customer Experience: Real Connection Becomes a Luxury
You get an email from your CEO.
There’s a short video — a thank-you message to loyal customers. Her eyes blink. Her voice cracks just right. She laughs at her own joke.
It feels real. Until you learn it isn’t.
That moment captures the new customer experience dilemma: when everything can be faked, trust becomes fragile.
Deepfakes — hyper-realistic AI-generated audio or video — are spilling into marketing, customer service, and internal communications.
The problem isn’t just fraud.
It’s confusion.
When customers stop believing what they see, your brand stops being believable.
Pindrop reports that phone-channel fraud has increased by 350 percent over the last four years, with many attacks targeting customer-facing or finance roles.
And once trust breaks, no amount of personalization or automation can glue it back together.
Here’s the Strange Opportunity Hidden in All This
If you’re in CX, this is your moment to lead — not to panic.
When fakery floods the market, authenticity becomes your competitive edge.
Customers aren’t just asking “Was this easy?” anymore.
They’re asking, “Was this real?”
That’s a different kind of KPI.
Brands that answer it clearly — by disclosing AI use, proving content origin, and prioritizing truth over polish — will win customer loyalty faster than those that hide behind convenience.
Think of authenticity as the new UX.
Just as “secure checkout” reassured e-commerce buyers, “verified human” will soon reassure digital customers.
Designing for Reality
Let’s get practical. Here’s how you make authenticity a working part of your CX strategy.
1. Build an authenticity layer
Start thinking about how your digital interactions can include verifiable signals of realness.
Watermark or tag AI-generated content so users know what they’re seeing.
Incorporate chain-of-custody metadata for videos (who created it, who approved it, and when); see the sketch after this list.
Use avatars only when they represent real people; don't present synthetic personas as authentic humans.
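To make "chain of custody" concrete, here's a minimal sketch in Python of what a signed provenance manifest could look like. It's illustrative only: the field names and the build_manifest/verify_manifest helpers are hypothetical, and the HMAC scheme stands in for whatever your security team would actually choose (production systems would more likely adopt an open standard such as C2PA Content Credentials).

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

# Hypothetical signing key; in practice this lives in a key vault.
SIGNING_KEY = b"replace-with-a-real-secret"

def build_manifest(video_bytes: bytes, creator: str,
                   approver: str, ai_generated: bool) -> dict:
    """Bundle chain-of-custody facts with a hash of the content itself."""
    manifest = {
        "content_sha256": hashlib.sha256(video_bytes).hexdigest(),
        "created_by": creator,
        "approved_by": approver,
        "approved_at": datetime.now(timezone.utc).isoformat(),
        "ai_generated": ai_generated,  # the disclosure flag your UI can surface
    }
    # Sign the manifest so tampering with any field is detectable.
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_manifest(video_bytes: bytes, manifest: dict) -> bool:
    """Re-derive the signature and the content hash; both must match."""
    unsigned = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(manifest.get("signature", ""), expected)
            and unsigned["content_sha256"] == hashlib.sha256(video_bytes).hexdigest())
```

The point is the shape, not the specific code: every asset carries who made it, who approved it, whether AI was involved, and a signature that breaks if anyone edits those facts.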
2. Train your team to spot and respond to deepfakes
Your customer-facing teams and your brand guardians must become deepfake-aware. The threats extend beyond fraud into brand reputation: a fake "message from our CEO" can go viral and undermine your company's integrity. Train people to ask: "Could this be synthetic?" "Does this feel off?" "What verification do we have?"
3. Disclose AI participation and set expectations
If part of the interaction uses AI (a voice assistant, video avatar, or generative text), disclose it. Transparency builds trust. If you hide it and customers feel tricked, you lose authenticity.
4. Redefine CX metrics to include authenticity
You measure NPS, CSAT, CES — now add “perceived authenticity/trust.” Ask: “Did the customer feel confident this interaction was real?” Use survey prompts or sentiment analysis to capture that.
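As a rough illustration, here's how a "perceived authenticity" item could be tallied next to CSAT. The question wording, the 1-to-5 scale, and the top-2-box cutoff are all assumptions you'd tune to your own survey design.

```python
from statistics import mean

# Hypothetical post-interaction survey item, scored 1-5:
# "How confident are you that you interacted with a real person
#  (or clearly disclosed AI) from our brand?"
responses = [5, 4, 5, 3, 2, 5, 4, 4, 5, 3]  # sample data for illustration

def perceived_authenticity(scores: list[int]) -> dict:
    """Summarize the authenticity item the way you'd summarize CSAT:
    top-2-box share plus the raw mean."""
    top2 = sum(1 for s in scores if s >= 4) / len(scores)
    return {"authenticity_top2_pct": round(top2 * 100, 1),
            "authenticity_mean": round(mean(scores), 2)}

print(perceived_authenticity(responses))
# -> {'authenticity_top2_pct': 70.0, 'authenticity_mean': 4.0}
```

Trend this number by channel, and a dip becomes an early warning that customers are starting to doubt who (or what) they're talking to.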
5. Humanize your brand voice
AI can mimic tone, but it struggles with intent, humility, and acknowledging error. Use your real people — your real front-line voices — to anchor authenticity. Make sure customers know there's a heartbeat behind the interface.
6. Monitor and mitigate brand misuse
Even if you're doing the "right" thing, imposters might use your identity in deepfake scams. Stay alert to unauthorized use of your brand or spokespeople. The cases below (criminals impersonating executives, fake ads featuring well-known figures) show how real the risk is.
Lessons from the Field
Here’s where theory meets the messy, modern marketplace—let’s ground this with real-world cases that show both the upside and the risks.
1. When Deepfakes Drive Engagement — But Blur the Line
In its "#whereeveryouare" campaign, Zalando used deepfake-style technology to recreate Cara Delevingne across some 290,000 localised ad variants (Vogue Business).
On one hand, the campaign shows how brands can use synthetic media to push scale, personalisation, and novelty. On the other, it raises questions: if the face, voice, and lips are "cloned," is the connection with the consumer still "real"? And how does the consumer feel when the "person" isn't actually there?
Key takeaway: Using deepfakes for brand storytelling can boost novelty and reach—but it also creates authenticity risks that CX teams must manage.
2. When Deepfakes Breach Trust — And the Bill Hits $25 Million
Arup, the global engineering firm, fell victim to a deepfake video-conference scam. An employee joined what appeared to be a routine meeting with their CFO. Every face on screen was fake. The scammers walked away with $25 million (Coverlink).
CX Insight: If your employees can’t trust who they see, your customers can’t either. Trust isn’t departmental — it’s systemic.
3. When Synthetic Media Powers Meaningful Storytelling
Cadbury used synthetic media in its Diwali campaign, letting local shop owners appear alongside a virtual Shah Rukh Khan—the beloved Bollywood star—to promote small businesses (CognitivePath).
Here, the deepfake created value, not confusion. The campaign clearly disclosed the technology, enhancing trust while empowering local entrepreneurs.
The Results
Over 130,000 personalized video ads were created and shared by shopkeepers.
The campaign reached more than 20 million people across social media.
It earned widespread press coverage and positive public sentiment.
It won Cannes Lions and Clio Awards for innovation and brand purpose.
But the bigger win was emotional. Local shop owners felt seen. Customers felt a human connection, even though the celebrity wasn’t physically there.
Key takeaway: Transparency is everything. Synthetic storytelling can work—if you never pretend it’s real.
4. When the Voice on the Call Isn’t Your CEO
Fraudsters used an AI-generated voice clone and public video footage of senior executives at WPP, the world's largest advertising firm, to stage a fraudulent meeting and try to persuade an agency leader to commit money and reveal sensitive information; the attempt failed only because staff grew suspicious (The Guardian).
This isn’t just a cybersecurity story — it shows how deepfakes can undermine brand integrity and stakeholder trust. If an external party can convincingly impersonate your leadership, customer-facing messages lose credibility.
Key takeaway: CX and brand teams must assume that everything digital could be faked. You need validation and verification protocols — not just design or convenience.
5. When Deepfakes Damage More Than Reputations
AI-generated images of Taylor Swift (and others) proliferated across social platforms without consent: some were sexually explicit, others were misused for fake endorsements. The sheer volume and visibility of their spread caused public outrage and reputational damage (The Guardian).
When the public sees a brand seemingly endorsing synthetic or manipulated content, or linked to the victims of a deepfake, the cost is more than embarrassment. Customers may start to question whether user reviews, influencers, or brand spokespeople are real.
Key takeaway: Your brand doesn't just need to guard against being faked; it also needs to guard against being tainted by association with synthetic content. Authenticity signals matter more than ever.
When the Real Becomes Rare
Deepfakes aren’t just a tech or fraud issue. They’re a CX issue—and a brand issue.
When every visual, every voice, every testimonial could be synthetic, trust becomes fragile. For CX leaders, this means:
You can’t assume authenticity—it must be designed.
You can’t treat trust as passive—you must actively signal and protect it.
You can’t rely only on speed or personalization—human contact and transparent authenticity become strategic levers.
Let me leave you with this question:
If your customers suddenly doubted the authenticity of your interactions, would they still believe they were connecting with you?
Because in a world of synthetic everything, real connection has become a luxury. And the brands that guard it, prove it, and own it are the ones that will lead.
💡 Your Move
This week: audit one customer touchpoint. Pick one where you use video, voice, social proof, or endorsements. Ask:
Is the person real or synthetic?
If synthetic, did we clearly disclose it?
Did the customer know who they were dealing with?
Can we add a signal that says “This is verified real”?
If not, that’s your next CX project.
What Successful CX Leaders Do on Sundays
DCX Links: Six must-read picks to fuel your leadership journey, delivered every Sunday morning. Dive into the latest edition now!
👋 Please Reach Out
I created this newsletter to help customer-obsessed pros like you deliver exceptional experiences and tackle challenges head-on. But honestly? The best part is connecting with awesome, like-minded people—just like you! 😊
Here’s how you can get involved:
Got feedback? Tell me what’s working, what’s not, or what you’d love to see next.
Stuck on something? Whether it’s a CX challenge, strategy question, or team issue, hit me up—I’m here to help.
Just want to say hi? Seriously, don’t be shy. I’d love to connect, share ideas, or even swap success stories.
Your input keeps this newsletter fresh and valuable. Let’s start a conversation—email me, DM me, or comment anytime. Can’t wait to hear from you!
— Mark
www.marklevy.co
Follow me on LinkedIn
Thanks for being here. I’ll see you next Tuesday at 8:15 am ET.
👉 If you enjoyed this newsletter and value this work, please consider forwarding it to your friends and colleagues or sharing it on social media. New to DCX? Sign Up.
✉️ Join 1,420+ CX Leaders Who Get the DCX Newsletter First
The DCX Newsletter helps CX professionals stay ahead of change with insights that educate, inspire, and coach you toward action.
Subscribe to get the next issue straight to your inbox—and keep building experiences that move before your customers do.