Recently, I watched a scene in the series How to Sell Drugs
Online that really caught my attention. The main character gets kidnapped by
the mafia.
The group’s smartest member realises that the mafia’s messages are being monitored
by the police. He tells them he can create a new communication platform, which gives
them a temporary escape from the bad guys. Later, they notice that the platform he
built for the mafia doesn’t have emojis. The
mafia members immediately start
complaining, because they’re so used to communicating with each other through
emojis.
This situation highlights a reality often seen in the world of
design: No matter how technically perfect a product is, it feels incomplete if it
fails to create an emotional connection with the user. Meaning in communication
isn’t just in the information being transmitted. It is also in how that information
makes you feel.
A section from the book I recently finished, User Friendly:
How the Hidden Rules of Design Are Changing the Way We Live, Work, and
Play (Cliff Kuang & Robert Fabricant), reminded me of this scene. In the
chapter titled “Form Follows Emotions”, the authors emphasize that good design isn’t
only about functionality; the emotional connection between the user and the
product is just as important.
One of the authors shared a story about his daughter, Evie.
When Evie turned 13, he bought her her first iPhone. She was eager to send a message
to her best friend, but there were no emojis, because the phone couldn’t install
special characters. Turning to her father, she said:
"Daddy, you bought the wrong
phone!"
These two stories illustrate a simple truth: People want to
form an emotional connection with the things they use. Today, this expectation
extends not only to designed products but also to technologies that talk to us,
respond to us, and understand us.
At this point, I started reflecting on the technological
transformation we’re experiencing. Lately, we often hear the question: “What were we
doing before ChatGPT?”
This question, in fact, reveals how quietly and quickly
technology has infiltrated our lives.
Artificial intelligence is no longer just a tool; it is
sometimes a counsellor, sometimes a confidant. Many people turn to ChatGPT when
they’re bored, struggling, or not feeling well emotionally. What’s more, they trust
its responses and act on the advice it gives.
So, how did a
non-human intelligence achieve this?
I came across a remarkable study that provides some answers.
Edited by Brian Bauer (University of Georgia), it was conducted in 2024 and
published in February 2025. The central question was straightforward: "Can machines
be therapists?"
In the study, responses written by 13 expert therapists and
doctors were compared with responses generated by ChatGPT-4.0. A total of 830
participants were shown responses to 18 different couple-therapy scenarios, and
three main questions were investigated:
1. Can participants distinguish which responses were written
by therapists and which by ChatGPT?
2. Do therapist responses or ChatGPT responses score higher in
empathy, understanding, and “common therapy factors”?
3. What linguistic differences exist between human and
AI-generated responses? (length, sentence structure, emotional content, etc.)
The results were striking.
Participants could hardly tell the difference between human and
AI responses. Moreover, ChatGPT’s responses often scored higher in warmth,
understanding, and empathy; its empathy scores even exceeded those of the
therapists. Participants found the chatbot more friendly and attentive.
These findings suggest that empathy is no longer an
exclusively human domain.
At the conclusion of the study, a note was made referring to
Joseph Weizenbaum (the MIT computer scientist, creator of the first chatbot, ELIZA,
and pioneer of AI criticism):
…, if GenAI cannot do it now, it
will find a way to imitate humans to a sufficient
degree soon. … : we must speedily discern the possible destination (for better or
worse) of the AI-therapist train as it may have already left the station.
Artificial intelligence is slowly but surely becoming an
indispensable part of our lives by mimicking humans and their emotions. By creating
the emotional connection that good design demands, it is succeeding in becoming a
natural part of life.
Just like in the 2013 film Her, AI may be evolving toward a
place far more
emotional than we ever imagined.
References
- Bauer, B. (2025). Can machines be therapists? PLOS Mental Health.
- Kuang, C., & Fabricant, R. (2020). User Friendly: How the hidden rules of design are changing the way we live, work, and play. HarperCollins.
- How to Sell Drugs Online [TV series]. (2019). Netflix.
- Jonze, S. (Director). (2013). Her [Film]. Warner Bros.