Project description

Anticipate robot movements (Intendicate)

Intendicate – An intuitive and iconic method to indicate robotic motion

Most industrial robots convey little to no information about their future movements. This makes fluid human-robot coordination virtually impossible and also poses a safety risk. Addressing this problem was the goal of the Intendicate project. We aimed to develop a method for robots to communicate their planned movements in an intuitively understandable way, inspired by an aspect of interpersonal interaction: humans track the gaze of others and thus recognise where their attention is directed. This behaviour is deeply rooted in us and requires no learning; eye movements automatically capture our attention. The research question was therefore: how can this aspect of natural interaction be transferred to human-robot interaction?

In collaboration with the Humboldt University of Berlin and the design agency why do birds, we researched and developed a solution consisting essentially of a pair of highly abstracted eyes, shown as an animation on a display mounted on the robot. Both the design and the animation needed to be human-like enough to capture attention, but not so naturalistic as to appear creepy or unsettling in the context of the robot. Various designs were developed and tested in a series of trials. The result is a demonstrator that provided a well-founded proof of concept in the final evaluation: people were able to predict the robot's movements earlier and with less effort when supported by the developed solution.

Project partners

  • Humboldt Universität zu Berlin
  • why do birds GmbH

Contact person


Paul Schweidler, M.Sc.

schweidler@human-factors.de

Related Publications

Onnasch, L., Schweidler, P., & Schmidt, H. (2023). The potential of robot eyes as predictive cues in HRI—an eye-tracking study. Frontiers in Robotics and AI, 10, 1178433. https://doi.org/10.3389/frobt.2023.1178433

Onnasch, L., Schweidler, P., & Wieser, M. (2023, March). Effects of predictive robot eyes on trust and task performance in an industrial cooperation task. In Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (pp. 442-446). https://doi.org/10.1145/3568294.3580123

Schweidler, P., & Onnasch, L. (2023). Using functionally anthropomorphic eyes to indicate robotic motion. In 1st International Conference on Hybrid Societies 2023.

Onnasch, L., Kostadinova, E., & Schweidler, P. (2022). Humans can’t resist robot eyes–reflexive cueing with pseudo-social stimuli. Frontiers in Robotics and AI, 72. https://doi.org/10.3389/frobt.2022.848295

Onnasch, L., Schmidt, H., & Schweidler, P. (2022, September). Effects of predictive robot eyes on attentional processes in HRI. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 66, No. 1, pp. 549-549). Sage CA: Los Angeles, CA: SAGE Publications.
