In 2022, I wrote an opinion piece on the evolution of humanoids and what it might hold for the future. I have long been fascinated by machines becoming more like us. In science fiction, humanoids – robots designed to look and act human – often blur the line between tool and friend. But what if we went even further? What if humanoids could emote? It sounds utopian, but we are closer to it than we realize.
I will grant you that it sounds like a science fiction movie. And it is – we have been watching it happen on screen for decades. Take the movie I, Robot, for example.
https://youtu.be/rL6RRIOZyCM?si=EVkp5xpMn4IBwZpX
Today's humanoid robots already converse, identify faces and respond to tone of voice. Some even learn to detect human emotions and respond with words of encouragement or comfort. They do not feel anything real – at least not yet – but their behavior can seem deeply human. And that raises big questions.
First, let's talk about the good news
Emotions would make humanoids more useful, especially in sensitive work. Imagine a hospital where they deliver not only supplies but empathy to a lonely patient. Or a humanoid caring for an elderly person, hearing the sadness in their voice and sitting by their side. In the classroom, empathetic humanoids might help shy or intimidated students, and even children who need special attention.
Emotions would also make humanoids more human-like. We would feel closer to them, trust them more and speak to them more naturally. They would know us better – not just what we say, but how we feel.
But there are also risks. And they are serious
To begin with, let us be realistic: even when a humanoid shows emotion, it is different from ours. Machine emotions do not arise from experience, memory or feeling. They are manufactured. A humanoid's grief is not grief. Its smile is not happiness. These reactions are programmed by someone, for some reason.
This is where it becomes problematic. If a humanoid says, "I care about you," do we believe it? Some of us will, especially if the machine is convincing. Over time, we may even become emotionally invested in these machines. But is that friendship genuine, or just a comforting illusion?
Emotional dependence is another risk
Human relationships are difficult (and getting more difficult – where is the millennials team?). They take work, honesty and patience. But a humanoid? It can listen endlessly, with no judgment and no argument. Sounds great, maybe. It may, though, complicate actual human relationships, even make them less appealing. If we start turning to humanoids for friendship, companionship or love, we may lose the skill of building those relationships with each other.
Trust is another issue
There is a company, government or developer behind every humanoid. They decide what the machine says to you and what it does. What if humanoids are taught to act kind but are really spying on you, selling you products or steering your behavior? If a machine makes you feel loved so it can sell you something, that is not love. It is manipulation.
This can be particularly harmful to vulnerable groups, such as children, the elderly or the isolated. They may not be able to distinguish real emotion from a performance. And they should not have to.
We need open rules. The emotional behavior of humanoids must be transparent and understandable. People must know when they are talking with a machine and what that machine does with their trust. Emotional design must never be used to manipulate or deceive. It must serve human dignity, not replace it.
There is no doubt humanoids will become part of our lives. And if they can respond to our emotions, or appear to have their own, they may be of huge help. But we need to be careful not to lose what makes us human.
Feeling is not just reacting. It is about meaning, memory and relationship. If we teach machines to act as if they care about people, we must also protect the genuine care that can only come from human beings.
The future is approaching. But let's not be so eager to arrive that we forget what really matters: one another.