Stop Saying CX Is Soft
Here is something I wish more CX teams would stop accepting:
We make a smart change.
Behavior improves.
And then, when it is time to explain why it worked, the room suddenly acts like we are talking about feelings.
A button label gets clearer. A form becomes easier to finish. A reassurance message removes hesitation at exactly the right moment. More people complete the task. Fewer drop off. Support volume dips.
The result is real.
But the explanation somehow gets treated like opinion.
That is the trap.
A lot of CX work gets dismissed as “soft” not because it lacks impact, but because people do a poor job of measuring and describing what changed.
That is one of the reasons I wrote Section 2 of The Psychology of CX 101.
I wanted to make a practical case for something I think a lot of practitioners already know in their bones: psychology is already shaping customer behavior every day. The real question is whether we know how to measure it in a way other people trust.
Because once you can do that, the conversation changes.
And to me, that is the whole point:
Psychology does not sound soft when the evidence is clear.
Where good CX work gets lost
Most teams do not have a psychology problem. They have a proof problem.
They improve clarity. They reduce friction. They remove uncertainty. They make something easier to understand, easier to trust, easier to continue.
And then they stop at “it feels better.”
Sometimes that is because no one built a measurement plan around the change. Other times, it is because teams try to measure every experience improvement the same way, which creates a different problem. They use one lens for everything and then wonder why the story does not hold together.
But not every psychological principle shows up the same way.
Some changes show up fast in behavior.
Some show up in confidence or trust.
Some need direct customer language before you can really understand what happened.
That distinction matters more than people think.
When you use the wrong measurement method, good work can look unprovable.
The measurement question I wish more teams asked
The useful question is not “how do we measure psychology?”
It is:
What kind of change did we make, and what kind of evidence would actually fit that change?
That is a much better starting point.
Because not everything needs a massive dashboard.
Not everything needs a formal study.
But every change does need a measurement approach that makes sense for the thing you are trying to influence.
When behavior is enough
If you are working on clarity, comprehension, or friction, behavioral data often gives you a strong read.
You simplify a process.
You reduce overload.
You rewrite confusing instructions.
You make the next step easier to recognize.
Then you look at completion, abandonment, drop-off, click-through, or task success.
That is often where the story is.
One example I use in the book is a 12-field form that was split into three shorter pages. Completion moved from 45% to 67%.
That is not a “soft” result.
That is not an “intuition-based” win.
That is operational improvement.
And it is exactly the kind of thing CX teams should get more comfortable saying out loud.
Not:
“We improved the experience.”
More like:
“We restructured the form, and significantly more customers finished it.”
Same work. Better language. Much more credibility.
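And if someone asks you to back up the word "significantly," you do not need a data science team. A two-proportion z-test is a quick sanity check you can run in a few lines. Here is a minimal sketch using made-up sample sizes (1,000 sessions before and after, matching the 45% and 67% completion rates from the form example); plug in your own counts:

```python
# Quick sanity check: is a 45% -> 67% completion lift real, or could
# it be noise? Sample sizes below are hypothetical placeholders.
from math import sqrt, erf

def completion_lift_z(before_done, before_total, after_done, after_total):
    """Two-proportion z-test for a change in completion rate."""
    p1 = before_done / before_total
    p2 = after_done / after_total
    # Pool the two samples to estimate the standard error under
    # the null hypothesis (no real change in completion rate)
    pooled = (before_done + after_done) / (before_total + after_total)
    se = sqrt(pooled * (1 - pooled) * (1 / before_total + 1 / after_total))
    z = (p2 - p1) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p1, p2, z, p_value

p1, p2, z, p = completion_lift_z(450, 1000, 670, 1000)  # hypothetical counts
print(f"before {p1:.0%}, after {p2:.0%}, z = {z:.1f}, p = {p:.4f}")
```

With samples that size, a 22-point lift is far outside noise. With 40 sessions instead of 1,000, the same percentages would be much shakier, which is exactly why stating the counts alongside the rates makes the claim more credible.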
When behavior gives you a false sense of success
This is where teams can get sloppy.
A customer can complete the journey and still feel uncertain.
They can convert and still trust you less.
They can get through the flow and still leave with that vague sense that something felt off.
That is why some changes need blended metrics.
You need the action and the feeling.
Conversion and confidence.
Adoption and trust.
Lower support contact and stronger satisfaction.
Because a journey is not healthy just because it performs.
That is worth repeating:
A journey is not healthy just because it converts.
Some of the most damaging experiences perform well in the short term.
Which is exactly why measurement has to be a little more thoughtful than “did the number go up?”
When you need to hear people, not just count them
Then there is the category of work where dashboards really are not enough.
Personalization sits here.
Identity sits here.
AI definitely sits here.
If you are trying to understand whether an AI assistant feels useful, whether personalization feels relevant or invasive, or whether a chatbot builds trust, a KPI will only tell you part of the story.
At that point, you need interviews.
You need open comments.
You need to hear how people describe the experience in their own words.
One of the ideas I push in this section is that people often respond better to AI when it is useful and transparent, not when it tries too hard to act human.
That matters.
Because a lot of teams are still designing AI interactions around performance theater. They want warmth, personality, human-like tone. But in many cases, what actually builds trust is much simpler: people want to know what the system is doing, why it is doing it, and whether they can rely on it.
That insight will not always appear neatly in a dashboard.
But it can absolutely change your strategy.
The real shift is often in how you talk about the work
I have seen this over and over: the work is solid, but the language around it weakens it.
If you say:
“We applied psychological principles to optimize the experience,”
you sound like you are presenting a framework.
If you say:
“We changed two words, and 40% more customers found what they needed without calling support,”
you sound like you improved the business.
That is the difference.
The psychology still matters. It explains why the result happened. But it does not need to lead the sentence.
The outcome is the headline. The psychology is the explanation.
That one shift alone will make a lot of CX work sound stronger inside the business.
What changes when teams get this right
The posture changes.
Instead of saying, “this feels better,” teams start saying things like:
completion improved,
support dependency dropped,
trust increased alongside usage,
more customers were willing to continue.
That is when CX stops sounding like a set of opinions and starts sounding like a discipline.
And honestly, that matters more now than it used to.
The teams that will have the most influence are not the ones with the most elegant language around customer behavior. They are the ones who can connect behavior to outcomes in a way that holds up under scrutiny.
If you cannot explain the impact clearly, someone else will define the work for you.
Usually badly.
One more thought. Not every lift is a healthy lift.
A manipulative prompt can increase conversion.
A dark pattern can boost clicks.
An overly human AI interaction can perform well early and still chip away at trust over time.
This is why I am wary of any measurement conversation that ends at “the number moved.”
That is not enough.
The better question is whether the experience improved in a way that is actually worth keeping.
That means checking for alignment between behavior and sentiment.
It means looking for short-term gains that create long-term damage.
It means asking whether the thing that worked is something you would still feel good about six months from now.
That is not softness.
That is judgment.
And CX needs more of it, not less.
Try this with your team this week
Pick one friction point in your journey.
Just one.
Ask:
What are we actually trying to influence here?
Is it clarity?
Trust?
Confidence?
Motivation?
Reassurance?
Then ask:
What kind of evidence fits that kind of change?
And finally:
How would we describe the result in plain English if it worked?
That last part matters.
Because the clearest explanation is usually the strongest one.
If the result is real, you should be able to say it simply.
That is when psychology stops sounding abstract.
That is when CX stops sounding soft.
That is when the work starts carrying its own weight.
Question for you: Where is your team still relying on instinct when stronger evidence would help? Reply and tell me. I may use a few examples in a future issue.
www.marklevy.co
Follow me on LinkedIn
What Successful CX Leaders Do on Sundays
DCX Links: Six must-read picks to fuel your leadership journey delivered every Sunday morning. Dive into the latest edition now!
Thanks for being here. I’ll see you next Tuesday at 8:15 am ET.
👉 If you enjoyed this newsletter and value this work, please consider forwarding it to your friends and colleagues or sharing it on social media. New to DCX? Sign Up.
✉️ Join 1,500+ CX leaders who stay ahead of the next customer curve.
Human-centered insights. Plug-and-play frameworks. Smart tools that actually work.
All designed for CX pros who want to build with purpose—and deliver with impact.
👉 Subscribe today and get the tools to elevate your strategy (and your sanity).