Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

MH-HMR: Human mesh recovery from monocular images via multi-hypothesis learning

Xuan, Haibiao, Zhang, Jinsong, Lai, Yukun and Li, Kun 2024. MH-HMR: Human mesh recovery from monocular images via multi-hypothesis learning. CAAI Transactions on Intelligence Technology. 10.1049/cit2.12337

PDF - Published Version
Available under License Creative Commons Attribution.



Recovering 3D human meshes from monocular images is an inherently ill-posed and challenging task due to depth ambiguity, joint occlusion, and truncation. However, most existing approaches do not model such uncertainties, typically yielding a single reconstruction for one input. In contrast, the authors embrace the ambiguity of the reconstruction and treat the task as an inverse problem that admits multiple feasible solutions. They propose a multi-hypothesis approach, multi-hypothesis human mesh recovery (MH-HMR), to efficiently model the multi-hypothesis representation and build strong relationships among the hypothesis features. Specifically, the task is decomposed into three stages: (1) generating a reasonable set of initial recovery results (i.e., multiple hypotheses) from a single colour image; (2) intra-hypothesis refinement, which enhances each single-hypothesis feature; and (3) inter-hypothesis communication, followed by regression of the final human meshes. The authors further exploit the multiple hypotheses and the recovery process to achieve human mesh recovery from multiple uncalibrated views. Compared with state-of-the-art methods, MH-HMR achieves superior performance and recovers more accurate human meshes on challenging benchmark datasets such as Human3.6M and 3DPW, while demonstrating effectiveness across a variety of settings. The code will be publicly available at
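The three-stage decomposition in the abstract can be sketched schematically. The code below is a minimal, illustrative toy pass, not the authors' implementation: all function names, feature shapes, the noise-based hypothesis generator, and the softmax-weighted averaging used as a stand-in for inter-hypothesis attention are assumptions made here for clarity.

```python
import numpy as np

# Toy sketch of the MH-HMR three-stage pipeline described in the abstract.
# Shapes, modules, and fusion rule are illustrative assumptions only.

N_HYP = 4      # number of hypotheses (assumed)
FEAT_DIM = 8   # per-hypothesis feature dimension (placeholder)

rng = np.random.default_rng(0)

def generate_hypotheses(image_feature: np.ndarray, n: int = N_HYP) -> np.ndarray:
    """Stage 1: produce n plausible initial recoveries from one image
    feature; noise perturbation stands in for a learned generator."""
    noise = rng.normal(scale=0.1, size=(n, image_feature.shape[0]))
    return image_feature[None, :] + noise              # (n, FEAT_DIM)

def intra_hypothesis_refine(hyps: np.ndarray) -> np.ndarray:
    """Stage 2: refine each hypothesis independently; a toy
    nonlinearity stands in for a per-hypothesis refinement module."""
    return np.tanh(hyps)

def inter_hypothesis_communicate(hyps: np.ndarray) -> np.ndarray:
    """Stage 3: let hypotheses exchange information; softmax-weighted
    averaging over hypotheses stands in for attention before the
    final mesh regression."""
    scores = hyps.sum(axis=1)                          # (n,) toy relevance
    weights = np.exp(scores) / np.exp(scores).sum()    # softmax over hypotheses
    return (weights[:, None] * hyps).sum(axis=0)       # fused (FEAT_DIM,)

# End-to-end toy pass: one image feature in, one fused mesh feature out.
image_feature = rng.normal(size=FEAT_DIM)
hyps = generate_hypotheses(image_feature)
refined = intra_hypothesis_refine(hyps)
fused = inter_hypothesis_communicate(refined)
```

The point of the sketch is the data flow: multiple hypotheses are kept in parallel through stages 1 and 2 and only merged in stage 3, so uncertainty is represented explicitly rather than collapsed to a single estimate at the outset.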

Item Type: Article
Date Type: Published Online
Status: In Press
Schools: Computer Science & Informatics
Publisher: Wiley Open Access
ISSN: 2468-2322
Funders: National Key Research and Development Program of China, National Natural Science Foundation of China, Science Fund for Distinguished Young Scholars of Tianjin Municipality
Date of First Compliant Deposit: 21 March 2024
Date of Acceptance: 15 November 2023
Last Modified: 30 Apr 2024 13:17


