Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

3D face reconstruction and gaze tracking in the HMD for virtual interaction

Chen, Shu-Yu, Lai, Yu-Kun ORCID: https://orcid.org/0000-0002-2094-5680, Xia, Shihong, Rosin, Paul ORCID: https://orcid.org/0000-0002-4965-3884 and Gao, Lin 2023. 3D face reconstruction and gaze tracking in the HMD for virtual interaction. IEEE Transactions on Multimedia 25 , pp. 3166-3179. 10.1109/TMM.2022.3156820

PDF - Accepted Post-Print Version (17MB)

Abstract

With the rapid development of virtual reality (VR) technology, VR headsets, a.k.a. Head-Mounted Displays (HMDs), are widely available, allowing immersive 3D content to be viewed. A natural need for truly immersive VR is to allow bidirectional communication: the user should be able to interact with the virtual world using facial expressions and eye gaze, in addition to traditional means of interaction. Typical application scenarios include VR virtual conferencing and virtual roaming, where ideally users are able to see other users' expressions and have eye contact with them in the virtual world. In addition, eye gaze also provides a natural means of interaction with virtual objects. Despite significant achievements in recent years in reconstructing 3D faces from RGB or RGB-D images, it remains a challenge to reliably capture and reconstruct 3D facial expressions, including eye gaze, when the user is wearing an HMD, because the majority of the face is occluded, especially the areas around the eyes, which are essential for recognizing facial expressions and eye gaze. In this paper, we introduce a novel real-time system that is able to capture and reconstruct the 3D faces of users wearing HMDs, and robustly recover eye gaze. We further propose a novel method to map eye gaze directions to the 3D virtual world, which provides a novel and useful interactive mode in VR. We compare our method with state-of-the-art techniques both qualitatively and quantitatively, and demonstrate the effectiveness of our system using live capture.
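The abstract mentions mapping recovered eye gaze directions into the 3D virtual world to enable interaction with virtual objects. The paper's actual mapping method is not reproduced here; as a generic, hedged illustration of the underlying idea, one can cast a ray from the reconstructed eye position along the gaze direction and intersect it with scene geometry (here, a plane). All function and parameter names below are illustrative assumptions, not the authors' API.

```python
import numpy as np

def gaze_ray_plane_hit(eye_pos, gaze_dir, plane_point, plane_normal):
    """Intersect a gaze ray with a plane; return the 3D hit point or None.

    A generic ray-plane intersection sketch, NOT the method from the paper;
    names and parameters are illustrative assumptions.
    """
    eye = np.asarray(eye_pos, dtype=float)
    d = np.asarray(gaze_dir, dtype=float)
    d = d / np.linalg.norm(d)                    # normalize gaze direction
    n = np.asarray(plane_normal, dtype=float)
    denom = np.dot(n, d)
    if abs(denom) < 1e-8:                        # gaze ray parallel to plane
        return None
    t = np.dot(n, np.asarray(plane_point, dtype=float) - eye) / denom
    if t < 0:                                    # plane is behind the viewer
        return None
    return eye + t * d

# Gaze straight ahead (-z) from the origin toward a virtual wall at z = -2
hit = gaze_ray_plane_hit([0, 0, 0], [0, 0, -1], [0, 0, -2], [0, 0, 1])
```

In a real HMD pipeline, `eye_pos` and `gaze_dir` would come from the reconstructed face and eye-tracking stage, and the hit point would drive selection or eye-contact cues on virtual objects.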

Item Type: Article
Date Type: Publication
Status: Published
Schools: Computer Science & Informatics
Publisher: Institute of Electrical and Electronics Engineers
ISSN: 1520-9210
Funders: The Royal Society
Date of First Compliant Deposit: 11 March 2022
Date of Acceptance: 16 February 2022
Last Modified: 06 Nov 2023 21:42
URI: https://orca.cardiff.ac.uk/id/eprint/148340
