Yi, Ran, Liu, Yong-Jin, Lai, Yu-Kun ORCID: https://orcid.org/0000-0002-2094-5680 and Rosin, Paul L. ORCID: https://orcid.org/0000-0002-4965-3884 2020. Unpaired portrait drawing generation via asymmetric cycle mapping. Presented at: Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, Washington, USA, 16-18 June 2020.
PDF (Accepted Post-Print Version, 8MB)
Abstract
Portrait drawing is a common form of art with high abstraction and expressiveness. Due to its unique characteristics, existing methods achieve decent results only with paired training data, which is costly and time-consuming to obtain. In this paper, we address the problem of automatic transfer from face photos to portrait drawings with unpaired training data. We observe that due to the significant imbalance of information richness between photos and drawings, existing unpaired transfer methods such as CycleGAN tend to embed invisible reconstruction information indiscriminately in the whole drawings, leading to important facial features partially missing in drawings. To address this problem, we propose a novel asymmetric cycle mapping that enforces the reconstruction information to be visible (by a truncation loss) and only embedded in selective facial regions (by a relaxed forward cycle-consistency loss). Along with localized discriminators for the eyes, nose and lips, our method well preserves all important facial features in the generated portrait drawings. By introducing a style classifier and taking the style vector into account, our method can learn to generate portrait drawings in multiple styles using a single network. Extensive experiments show that our model outperforms state-of-the-art methods.
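For readers skimming the record, a minimal PyTorch-style sketch of the two mechanisms named in the abstract may help. It assumes hypothetical generators `G` (photo to drawing) and `F_inv` (drawing to photo) and an assumed binary `face_mask`; the quantization-based truncation and the mask-based relaxation are illustrative interpretations of the abstract, not the authors' exact loss formulations.

```python
import torch
import torch.nn.functional as F

def truncate(drawing: torch.Tensor, levels: int = 16) -> torch.Tensor:
    """Quantize a generated drawing (values in [-1, 1]) to a few visible
    intensity levels, so reconstruction information cannot hide in
    imperceptible perturbations. Illustrative stand-in for the truncation
    loss idea, not the paper's exact formulation."""
    d = torch.round((drawing + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1
    # Straight-through estimator: the forward pass uses the truncated
    # values, while gradients flow as if truncation were the identity.
    return drawing + (d - drawing).detach()

def relaxed_cycle_loss(photo, G, F_inv, face_mask):
    """Photo -> drawing -> reconstructed photo, with the forward
    cycle-consistency penalty restricted to selected facial regions.
    G, F_inv and face_mask are assumptions for this sketch."""
    drawing = G(photo)
    recon = F_inv(truncate(drawing))
    return F.l1_loss(recon * face_mask, photo * face_mask)
```

In a full training loop this term would be combined with adversarial losses and, per the abstract, localized discriminators for the eyes, nose and lips.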
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Status: | In Press |
| Schools: | Computer Science & Informatics |
| Funders: | The Royal Society |
| Date of First Compliant Deposit: | 30 March 2020 |
| Date of Acceptance: | 27 February 2020 |
| Last Modified: | 26 Aug 2023 17:39 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/130656 |
Citation Data
Cited 31 times in Scopus.