Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Looking into gait for perceiving emotions via bilateral posture and movement graph convolutional networks

Zhai, Yingjie, Jia, Guoli, Lai, Yu-Kun ORCID: https://orcid.org/0000-0002-2094-5680, Zhang, Jing, Yang, Jufeng and Tao, Dacheng 2024. Looking into gait for perceiving emotions via bilateral posture and movement graph convolutional networks. IEEE Transactions on Affective Computing 15 (3) , pp. 1634-1648. 10.1109/TAFFC.2024.3365694

PDF - Accepted Post-Print Version (1 MB)
Available under License Creative Commons Attribution.

Abstract

Emotions can be perceived from a person's gait, i.e., their walking style. Existing methods for gait emotion recognition mainly leverage posture information as input, but ignore body movement, which contains complementary information for recognizing emotions conveyed by gait. In this paper, we propose a Bilateral Posture and Movement Graph Convolutional Network (BPM-GCN) that consists of two parallel streams, namely a posture stream and a movement stream, to recognize emotions from two views. The posture stream aims to explicitly analyse the emotional state of the person. Specifically, we design a novel regression constraint based on hand-engineered features to distill prior affective knowledge into the network and boost representation learning. The movement stream is designed to describe the intensity of the emotion, which is an implicit cue for recognizing emotions. To achieve this goal, we employ a higher-order velocity-acceleration pair to construct graphs, in which the informative movement features are utilized. In addition, we design a PM-Interacted feature fusion mechanism to adaptively integrate the features from the two streams. Therefore, the two streams collaboratively contribute to the performance from two complementary views. Extensive experiments on the largest benchmark dataset, Emotion-Gait, show that BPM-GCN performs favorably against state-of-the-art approaches (with at least a 4.59% performance improvement). The source code is released at https://github.com/exped1230/BPM-GCN.
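The movement stream described above builds its graphs from a higher-order velocity-acceleration pair derived from the joint positions. The paper does not give implementation details here, but the idea can be sketched with finite differences over a skeleton sequence; the function name, array shapes, and joint count below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def movement_features(joints: np.ndarray) -> np.ndarray:
    """Sketch of a velocity-acceleration feature pair for gait.

    joints: array of shape (T, J, 3) -- T frames, J skeleton
    joints, 3D coordinates. Returns an array of shape
    (T - 2, J, 6): per-joint velocity and acceleration
    concatenated along the last axis, one pair per frame
    where both quantities are defined.
    """
    # First-order difference: velocity between consecutive frames.
    velocity = joints[1:] - joints[:-1]            # (T-1, J, 3)
    # Second-order difference: acceleration (change of velocity).
    acceleration = velocity[1:] - velocity[:-1]    # (T-2, J, 3)
    # Align velocity with acceleration and stack into one feature.
    return np.concatenate([velocity[1:], acceleration], axis=-1)

# Example: 10 frames of a 16-joint skeleton.
seq = np.random.rand(10, 16, 3)
feats = movement_features(seq)
print(feats.shape)  # (8, 16, 6)
```

In the paper these per-joint features would feed the graph convolutional layers of the movement stream, while raw postures feed the posture stream; the released code at the GitHub link above is the authoritative reference.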

Item Type: Article
Date Type: Publication
Status: Published
Schools: Computer Science & Informatics
Publisher: Institute of Electrical and Electronics Engineers
ISSN: 1949-3045
Funders: Natural Science Foundation of Tianjin Municipality, Fundamental Research Funds for the Central Universities
Date of First Compliant Deposit: 21 March 2024
Date of Acceptance: 2 February 2024
Last Modified: 05 Nov 2024 16:00
URI: https://orca.cardiff.ac.uk/id/eprint/167409
