Participants wearing eye-tracking glasses.
© E. Böker, CASCB

Visual ecology of human communication

The dynamics of head and eye gaze between two or more individuals during verbal and non-verbal face-to-face communication contain a wealth of information. In human face-to-face communication, gaze serves both volitional and unconscious signalling. During computer-mediated video communication, visual signals about gaze behaviour and other deictic cues are still produced and conveyed, but the information they carry is often spurious and potentially misleading.

This project aims to study eye contact behaviour between communicating human interlocutors by measuring mutual gaze in three settings: during face-to-face communication, in conventional screen-based communication, and while connecting communication partners with MPdepth, a new display technology developed by visiting professor Nikolaus Troje and the BioMotionLab at York University. The technology employs head- and face-tracking functionality to re-introduce natural eye-contact behaviour into screen-based video communication.

Participants wearing eye-tracking glasses communicating via tablet.
© E. Böker, CASCB

For the current project, we will build on eye-tracking and motion-capture technology pioneered by cluster members Prasetia Puta and Fumihiro Kano.

Participants will be recruited in pairs and asked to engage in short conversations while playing “Heads-Up”, a short cooperative game. In addition to studying eye movements during conversation, we will add measures of autonomic arousal and its fluctuation over time to assess levels of synchronization between interlocutors. These involve measuring galvanic skin response, heart rate variability, and pupil dilation/constriction simultaneously from both communication partners. Synchrony of autonomic arousal responses between partners has been shown to be more pronounced during face-to-face communication than during video communication, but it is not known whether these differences are due to differences in eye contact behaviour. Here, we are particularly interested in whether the MPdepth communication platform results in eye contact behaviour and autonomic arousal measures comparable to those observed in real-life face-to-face communication.
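
The project description does not commit to a specific analysis pipeline, so the following is only a minimal illustrative sketch of how arousal synchrony between two partners might be quantified: a sliding-window Pearson correlation between two simultaneously recorded arousal signals (e.g. skin conductance). All names, sampling rates, and window parameters are hypothetical placeholders, not the project's actual methods.

    # Illustrative sketch only: windowed Pearson correlation between two
    # simultaneously recorded arousal signals (e.g. skin conductance from
    # both partners). Parameters and names are hypothetical.
    import numpy as np

    def windowed_synchrony(signal_a, signal_b, fs=4.0, window_s=30.0, step_s=5.0):
        """Correlate two equally sampled time series in sliding windows.

        fs       -- sampling rate in Hz (assumed identical for both signals)
        window_s -- window length in seconds
        step_s   -- step between successive windows in seconds
        Returns arrays of window-centre times (s) and correlation coefficients.
        """
        a = np.asarray(signal_a, dtype=float)
        b = np.asarray(signal_b, dtype=float)
        n = min(len(a), len(b))
        win = int(window_s * fs)
        step = int(step_s * fs)
        times, corrs = [], []
        for start in range(0, n - win + 1, step):
            wa = a[start:start + win]
            wb = b[start:start + win]
            # Skip flat segments, where the correlation is undefined.
            if wa.std() == 0 or wb.std() == 0:
                continue
            corrs.append(np.corrcoef(wa, wb)[0, 1])
            times.append((start + win / 2) / fs)
        return np.array(times), np.array(corrs)

    # Example with synthetic data standing in for two partners' recordings.
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = np.arange(0, 600, 1 / 4.0)        # ten minutes at 4 Hz
        shared = np.sin(2 * np.pi * t / 60)   # a common slow fluctuation
        partner1 = shared + 0.5 * rng.standard_normal(t.size)
        partner2 = shared + 0.5 * rng.standard_normal(t.size)
        times, corrs = windowed_synchrony(partner1, partner2)
        print(f"mean windowed correlation: {corrs.mean():.2f}")

Higher average windowed correlations across a conversation would, under these assumptions, indicate stronger autonomic synchrony between the two partners, which could then be compared across the face-to-face, conventional video, and MPdepth conditions.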