When human counselors are unavailable to provide work-based wellness coaching, robots can substitute—as long as the workers are comfortable with emerging technologies and the machines aren’t overly humanlike.
That wariness toward overly humanlike machines is known as the uncanny valley effect, and researchers at the University of Cambridge in the U.K. seem to have caught it lurking when they tested two types of robotic coaches on employees of a tech company.
The team is presenting the project this week in Stockholm, Sweden, at the ACM/IEEE International Conference on Human-Robot Interaction [1].
Computer scientist Hatice Gunes, PhD, and colleagues collaborated with the global tech consultancy Cambridge Consultants to have 26 of the company’s staff members interact with either the 3-foot-tall humanoid QTrobot (LuxAI, Luxembourg City) or the small, toylike Misty robot (Misty Robotics, Boulder, Colo.) over a four-week span.
‘Valuable insights for robotic wellbeing coach design’
Prior to the project, the robots were programmed to deliver wellness coaching guided by professional mental health practitioners and grounded in the relevant peer-reviewed literature.
For the study, Gunes and colleagues had the robots lead the workers through a series of positive psychology exercises at the rate of one session per week.
Upon analyzing quantitative data gathered via standardized and specifically designed questionnaires, along with qualitative data from in-person interviews and focus groups, the researchers found “the robot form significantly impacts coachees’ perceptions of the robotic coach in the workplace.”
Specifically, the coached workers much preferred the diminutive, cartoonish Misty to the humanoid QTrobot, perceiving the former to have more appropriate behavior and a more relatable personality.
The employees also felt a greater sense of “connection” with Misty, the authors report.
“Our study provides valuable insights for robotic wellbeing coach design and deployment and contributes to the vision of taking robotic coaches into the real world,” Gunes and co-authors comment in their discussion section.
When great expectations collide with ho-hum reality
In coverage of the research by the university’s news division, the study’s first author, Micol Spitale, PhD, says the affinity participants felt for the Misty robot may have come down to matched expectations: the toylike Misty delivered roughly what it seemed to promise.
“Since QT is more humanoid, they expected it to behave like a human, which may be why participants who worked with QT were slightly underwhelmed,” Spitale adds.
Gunes, the study’s senior author, seems to concur.
“The most common response we had from participants was that their expectations of the robot didn’t match with reality,” she says. “New developments in large language models could really be beneficial in this respect.”
Gunes suggests workplace robot coaches can serve the practical aim of reminding workers, in a hard-to-ignore way, to keep up with a wellness regimen.
“Just saying things out loud, even to a robot,” she adds, “can be helpful when you’re trying to improve mental wellbeing.”
The study is available in full for free.