GazeChat: Enhancing Virtual Conferences With Gaze-Aware 3D Photos

Communication software such as Clubhouse and Zoom has evolved to be an integral part of many people's daily lives. However, due to network bandwidth constraints and concerns about privacy, participants often turn their cameras off in video conferences. This leads to a situation in which people can only see each other's profile images, which is essentially an audio-only experience. Even when switched on, video feeds do not provide accurate cues as to who is talking to whom. This paper introduces GazeChat, a remote communication system that visually represents users as gaze-aware 3D profile photos. This satisfies users' privacy needs while keeping online conversations engaging and efficient. GazeChat uses a single webcam to track whom each participant is looking at, then uses neural rendering to animate all participants' profile images so that participants appear to be looking at each other. We conducted a remote user study (N=16) to evaluate GazeChat in three conditions: audio conferencing with profile photos, GazeChat, and video conferencing. Based on the results of our user study, we conclude that GazeChat maintains the feeling of presence while preserving more privacy and requiring lower bandwidth than video conferencing, provides a greater level of engagement than audio conferencing, and helps people better understand the structure of their conversation.
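The abstract describes a two-stage pipeline: estimate from a single webcam whom a participant is looking at, then re-render every remote participant's profile photo so gaze directions appear consistent across screens. The sketch below illustrates only the second, geometric step; the names, tile layout, and coordinates are illustrative assumptions, not part of the GazeChat implementation, and the webcam gaze estimate is reduced to a single horizontal coordinate.

```python
# Minimal sketch of gaze-target selection and cross-screen gaze redirection.
# Hypothetical layout: each participant's profile photo occupies a tile whose
# center has a horizontal screen coordinate in [-1, 1].
TILE_CENTERS = {"alice": -0.6, "bob": 0.0, "carol": 0.6}

def gaze_target(gaze_x, tiles=TILE_CENTERS):
    """Map an estimated horizontal gaze coordinate (stand-in for the webcam
    gaze tracker's output) to the participant whose tile center is nearest."""
    return min(tiles, key=lambda name: abs(tiles[name] - gaze_x))

def render_direction(viewer, speaker, target, tiles=TILE_CENTERS):
    """On `viewer`'s screen, decide where `speaker`'s 3D photo should look:
    straight at the viewer if the speaker is looking at them (eye contact),
    otherwise offset toward the target's tile."""
    if target == viewer:
        return 0.0  # eye contact: face the camera
    return tiles[target] - tiles[speaker]
```

In the actual system this direction would drive the neural renderer that animates the 3D profile photo; here it is just a signed horizontal offset.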

Publications


GazeChat: Enhancing Virtual Conferences With Gaze-Aware 3D Photos

Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology (UIST), 2021.
Keywords: eye contact, gaze awareness, video conferencing, video-mediated communication, gaze interaction, augmented communication, augmented conversation

Videos

Talks


Fusing Physical and Virtual Worlds into An Interactive Metaverse

Ruofei Du

Invited Talk at UCLA by Prof. Yang Zhang, Remote Talk.



Polymerizing Physical and Virtual Worlds into An Interactive Metaverse

Ruofei Du

Invited Talk by Prof. Arthur Theil at Birmingham City University, Remote Talk.



Blending Physical and Virtual Worlds into An Interactive Metaverse

Ruofei Du

Invited Talk at Wayne State University, Remote Talk.



Fusing Physical and Virtual Worlds into Interactive Mixed Reality

Ruofei Du

Invited Talk at George Mason University, Remote Talk.


Cited By

  • I Cannot See Students Focusing on My Presentation; Are They Following Me? Continuous Monitoring of Student Engagement Through "Stungage". arXiv:2204.08193. Snigdha Das, Sandip Chakraborty, and Bivas Mitra.
  • Local Free-View Neural 3D Head Synthesis for Virtual Group Meetings. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). Sebastian Rings and Frank Steinicke.
  • Augmented Chironomia for Presenting Data to Remote Audiences. arXiv:2208.04451. Brian D. Hall, Lyn Bartram, and Matthew Brehmer.
  • A State of the Art and Scoping Review of Embodied Information Behavior in Shared, Co-Present Extended Reality Experiences. Electronic Imaging. Kathryn Hays, Arturo Barrera, Lydia Ogbadu-Oladapo, Olumuyiwa Oyedare, Julia Payne, Mohotarema Rashid, Jennifer Stanley, Lisa Stocker, Christopher Lueg, Michael Twidale, and Ruth West.