Gaze-based Attention System

for Multi-human Multi-robot Interaction

I Direct Gaze


In humans and other social animals, gaze direction plays an important role in regulating communication between individuals. In this work, each human can “select” (obtain the undivided attention of) a ground robot and interact with it simply by gazing (looking directly) at it.

Each robot assigns human identities to the tracked faces in its camera view using a local Hungarian algorithm. The gaze direction of each face is estimated via vision, and a score is assigned to each robot-face pair (a higher score means a higher probability of attention). The system then finds the globally optimal allocation of robot-to-human selections using a second, centralized Hungarian algorithm.
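The centralized allocation step can be sketched as follows. This is a minimal illustration, not the system's actual code: for the small team sizes involved, a brute-force search over permutations returns the same optimal assignment as the Hungarian algorithm; the function and variable names are assumptions.

```python
from itertools import permutations

def optimal_selection(score):
    """Find the robot-to-human assignment maximizing total gaze score.

    score[i][j] is the estimated attention score of human i toward
    robot j (higher means human i is more likely gazing at robot j).
    Each human is matched to at most one robot and vice versa.
    """
    n_humans = len(score)
    robots = range(len(score[0]))
    best, best_total = None, float("-inf")
    # Try every one-to-one assignment of robots to humans.
    for perm in permutations(robots, n_humans):
        total = sum(score[i][j] for i, j in enumerate(perm))
        if total > best_total:
            best, best_total = list(perm), total
    # best[i] is the robot selected by human i
    return best

# Example: two humans, three robots
scores = [[0.9, 0.1, 0.2],
          [0.8, 0.7, 0.1]]
print(optimal_selection(scores))  # → [0, 1]
```

Note that greedily taking each human's highest-scoring robot can conflict (both humans here score robot 0 highest); the global optimum resolves such conflicts, which is why a second, centralized matching pass is needed.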

System flowchart:

This extends existing work in which a single human can select one or more robots from a population. It was accepted as a workshop paper at HRI 2016.

Optimal Gaze-Based Robot Selection in Multi-Human Multi-Robot Interaction
L. Zhang, R. Vaughan
Human-Robot Interaction Pioneers Workshop at the 2016 ACM/IEEE International Conference on Human-Robot Interaction (HRI 2016 Workshop), Christchurch, New Zealand, March 2016

Demo video:

II Indirect Gaze

In a multi-human interaction scenario, as shown in the figure, a human can be told that someone else is seeking their attention even when they cannot see that person themselves.

To allow the robots to achieve this, the position of each face is estimated in addition to its pose. A Vicon motion capture system provides the positions of the ground robots. Information about detected user locations and gaze directions is shared among the robots via a centralized server over WiFi. A useful consequence is that robots can be selected by people they cannot see.
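Once face positions and gaze directions live in a shared world frame, any robot can score how directly a gaze ray points at it, whether or not its own camera sees that face. A minimal 2D sketch of such a scoring function follows; the function name and the cosine-based score are assumptions for illustration, not the paper's exact formulation.

```python
import math

def gaze_score(face_pos, gaze_dir, robot_pos):
    """Score how directly a gaze ray points at a robot.

    face_pos, robot_pos: (x, y) world coordinates (robot positions
    would come from motion capture); gaze_dir: unit vector of the
    estimated gaze direction. Returns the cosine of the angle between
    the gaze ray and the face-to-robot vector, so 1.0 means the face
    is looking straight at the robot.
    """
    dx = robot_pos[0] - face_pos[0]
    dy = robot_pos[1] - face_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 1.0
    return (gaze_dir[0] * dx + gaze_dir[1] * dy) / dist

# A face at the origin looking along +x: the shared server lets every
# robot evaluate this, even one whose camera cannot see the face.
print(gaze_score((0, 0), (1, 0), (2, 0)))  # → 1.0  (dead ahead)
print(gaze_score((0, 0), (1, 0), (0, 2)))  # → 0.0  (90° off-axis)
```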

System Flowchart:

This is the first demonstration of optimal many-to-many robot-selection HRI. It was accepted as a full paper at IROS 2016.

Optimal Robot Selection by Gaze Direction in Multi-Human Multi-Robot Interaction
L. Zhang, R. Vaughan
2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016), Korea, October 2016.

Demo video:

III UAV Interaction

Micro Feedback Behaviour

Instead of selecting a robot with a single gaze, this work extends the previous ones with a series of “micro interactions” and “micro feedback” behaviours.

  • Preselect: The human preselects a robot by gazing at it. The robot spins to face the human and its LED lights turn yellow.
  • Lock: The human nods, locking the robot to that human so it cannot be selected by anyone else.
  • Lost: If the selecting human leaves the robot's camera view, the LED lights turn red.
  • Unlock: The human shakes their head to unlock the robot.

Demo video:
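The micro-feedback behaviours above form a small per-robot state machine. The sketch below is an assumed reconstruction (state names, events, and the LOST-to-LOCKED recovery transition are my assumptions, not taken from the paper):

```python
# States a robot can be in with respect to one human.
UNSELECTED, PRESELECTED, LOCKED, LOST = (
    "unselected", "preselected", "locked", "lost")

TRANSITIONS = {
    (UNSELECTED, "gaze"): PRESELECTED,     # robot faces human, LEDs yellow
    (PRESELECTED, "nod"): LOCKED,          # robot locked to this human
    (LOCKED, "human_out_of_view"): LOST,   # LEDs red
    (LOST, "human_in_view"): LOCKED,       # assumed recovery transition
    (LOCKED, "shake"): UNSELECTED,         # robot unlocked
}

def step(state, event):
    """Advance the state machine; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

# Walk through a full interaction cycle.
state = UNSELECTED
for event in ["gaze", "nod", "human_out_of_view", "human_in_view", "shake"]:
    state = step(state, event)
print(state)  # → unselected
```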

Multiple UAV Selection

This work repeats the experiments described in sections I and II using flying vehicles.

Demo video: