Using Eye-tracking Technology for Coordinating Communication in Teams

  • Subject: Intelligent User Interfaces; Eye-tracking
  • Type: Master Thesis
  • Date: 07.04.2021
  • Supervisor:

    Dr. Peyman Toreini 

  • Person in Charge: Moritz Langner
  • Add on:

    Status: Taken

  • Problem Description

    In the field of business intelligence and analytics (BI&A), BI&A dashboards are the graphical user interfaces in which (analytical) information is presented. Decision makers use dashboards in team meetings to discuss the status quo and plan next steps. Teams of decision makers can communicate successfully when they maintain proper joint attention on the BI&A dashboard during the conversation. Such joint attention is known to be required for supporting the team in making appropriate decisions.
    Attention-aware BI&A dashboards are dashboards that are sensitive to the attention of the decision makers. Eye-movement data is an indicator of attention allocation; it is an established source for designing novel intelligent user interfaces (IUIs) and has been used broadly in different fields. Using real-time eye-movement information while remotely located users communicate with each other enables a system to gain more contextual information about the users' communication and thereby support coordinating it. For this purpose, eye-tracking devices collect data such as fixations, saccades, pupil size, and gaze position, which serve as input to increase the intelligence level of the shared user interface. This includes analyzing the users' joint attentional behavior and estimating and predicting their cognitive states from gaze information.
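    As a minimal illustration (not part of the thesis specification), a shared-gaze system could flag joint attention whenever two remote users' gaze points fall within a small radius of each other on the shared dashboard. The normalized coordinates, function names, and threshold below are assumptions for the sketch, not a validated design:

    ```python
    import math

    # Assumed threshold in normalized screen units (0..1); a real study
    # would calibrate this value per display and eye tracker.
    JOINT_ATTENTION_RADIUS = 0.05

    def gaze_distance(gaze_a, gaze_b):
        """Euclidean distance between two (x, y) gaze positions."""
        return math.hypot(gaze_a[0] - gaze_b[0], gaze_a[1] - gaze_b[1])

    def has_joint_attention(gaze_a, gaze_b, radius=JOINT_ATTENTION_RADIUS):
        """True if both users look at roughly the same dashboard region."""
        return gaze_distance(gaze_a, gaze_b) <= radius

    # Both users fixate near the same chart:
    print(has_joint_attention((0.42, 0.61), (0.44, 0.60)))  # prints True
    # Users attend to opposite corners of the dashboard:
    print(has_joint_attention((0.10, 0.10), (0.90, 0.85)))  # prints False
    ```

    In practice such a check would run over streamed fixation data rather than single samples, but the same distance test is the core of many shared-gaze awareness features.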

    Goal of Thesis

    The goal is a better understanding of the benefits of shared gaze for the performance of decision-maker teams while jointly using a BI&A dashboard:

    • Investigate the current state of the art in attention-aware UIs that coordinate communications through eye-movement data
    • Design and develop a shared-gaze feature for a BI&A dashboard to coordinate communication (focusing on a dyad in a first step)
    • Evaluate the feature with regard to joint performance

    Skills required

    • Strong analytical skills
    • Very good time management, organizational and communication skills
    • Programming skills (e.g. Python or C#)
    • Interest in BI&A research and specifically, attention-aware BI&A dashboards
    • Interest in designing intelligent user interfaces
    • Good English skills

    References:

    1. Laura Dabbish and Robert Kraut. 2008. Awareness displays and social motivation for coordinating communication. Information Systems Research 19, 2: 221–238.
    2. Yanxia Zhang, Ken Pfeuffer, Ming Ki Chong, Jason Alexander, Andreas Bulling, and Hans Gellersen. 2017. Look together: using gaze for assisting co-located collaborative search. Personal and Ubiquitous Computing 21.
    3. Bin Xu, Jason Ellis, and Thomas Erickson. 2017. Attention from afar: Simulating the gazes of remote participants in hybrid meetings. In Proceedings of the 2017 Conference on Designing Interactive Systems. ACM.
    4. Susan E. Brennan, et al. 2008. Coordinating cognition: The costs and benefits of shared gaze during collaborative search. Cognition 106, 3: 1465–1477.
    5. Claudia Roda and Julie Thomas. 2006. Attention aware systems: Theories, applications, and research agenda. Computers in Human Behavior 22, 4: 557–587.