Robots can improve mental wellbeing in the workplace, but they have to look right, research suggests.
The perception of how effective the machines are depends in large part on what the robot looks like, according to the study.
University of Cambridge researchers carried out a study in a tech consultancy firm using two different robot wellbeing coaches.
Twenty-six employees took part in weekly robot-led wellbeing sessions for four weeks.
While the robots had identical voices, facial expressions, and scripts for the sessions, their physical appearance affected how people interacted with them.
Those who did their wellbeing exercises with a toy-like robot said they felt more of a connection with their “coach” than people who worked with a humanoid-like robot.
The researchers say that perceptions of robots are shaped by popular culture, where the only limit on what robots can do is the imagination.
But robots in the real world often fail to live up to those expectations.
According to the researchers, since the toy-like robot looks simpler, people may have had lower expectations and ended up finding the robot easier to talk to and connect with.
Those who worked with the humanoid robot found that their expectations did not match reality, the researchers said, since the robot was not capable of holding interactive conversations.
The researchers collaborated with local technology company Cambridge Consultants to design and implement a workplace wellbeing programme using robots.
Over the course of four weeks, employees were guided through four different wellbeing exercises by one of two robots: either the QTRobot (QT) or the Misty II robot (Misty).
The QT is a childlike humanoid robot roughly 90cm tall, while Misty is a 36cm-tall toy-like robot.
Both have screen faces that can be programmed with different facial expressions.
Dr Micol Spitale, the paper’s first author, said: “It could be that since the Misty robot is more toy-like, it matched their expectations.
“But since QT is more humanoid, they expected it to behave like a human, which may be why participants who worked with QT were slightly underwhelmed.”
After speaking to different wellbeing coaches, the researchers programmed the robots to have a coach-like personality, with high openness and conscientiousness.
Professor Hatice Gunes, from Cambridge’s Department of Computer Science and Technology, who led the research, said: “The most common response we had from participants was that their expectations of the robot didn’t match with reality.
“We programmed the robots with a script, but participants were hoping there would be more interactivity.
“It’s incredibly difficult to create a robot that’s capable of natural conversation. New developments in large language models could really be beneficial in this respect.”
Co-author Minja Axelsson said: “Our perceptions of how robots should look or behave might be holding back the uptake of robotics in areas where they can be useful.”
The findings are presented at the ACM/IEEE International Conference on Human-Robot Interaction in Stockholm.