Yi, Ran, Xia, Mengfei, Liu, Yong-Jin, Lai, Yu-Kun (ORCID: https://orcid.org/0000-0002-2094-5680) and Rosin, Paul L. (ORCID: https://orcid.org/0000-0002-4965-3884) 2021. Line drawings for face portraits from photos using global and local structure based GANs. IEEE Transactions on Pattern Analysis and Machine Intelligence 43 (10), pp. 3462-3475. DOI: 10.1109/TPAMI.2020.2987931
Abstract
Despite significant effort and notable success of neural style transfer, it remains challenging for highly abstract styles, in particular line drawings. In this paper, we propose APDrawingGAN++, a generative adversarial network (GAN) for transforming face photos to artistic portrait drawings (APDrawings), which addresses substantial challenges including highly abstract style, different drawing techniques for different facial features, and high perceptual sensitivity to artifacts. To address these, we propose a composite GAN architecture that consists of local networks (to learn effective representations for specific facial features) and a global network (to capture the overall content). We provide a theoretical explanation for the necessity of this composite GAN structure by proving that any GAN with a single generator cannot generate artistic styles like APDrawings. We further introduce a classification-and-synthesis approach for lips and hair where different drawing styles are used by artists, which applies suitable styles for a given input. To capture the highly abstract art form inherent in APDrawings, we address two challenging operations — (1) coping with lines with small misalignments while penalizing large discrepancy and (2) generating more continuous lines — by introducing two novel loss terms: one is a novel distance transform loss with nonlinear mapping and the other is a novel line continuity loss, both of which improve the line quality. We also develop dedicated data augmentation and pre-training to further improve results. Extensive experiments, including a user study, show that our method outperforms state-of-the-art methods, both qualitatively and quantitatively.
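The abstract's first novel loss term tolerates small line misalignments while penalizing large discrepancies via a distance transform with a nonlinear mapping. As a rough illustration of that idea only (the function name, thresholds, and the quadratic mapping below are assumptions, not the paper's exact formulation), a minimal sketch might measure, for each predicted line pixel, its distance to the nearest ground-truth line pixel and apply a nonlinear penalty:

```python
import numpy as np

def distance_transform_loss(pred, target, threshold=0.5, alpha=2.0):
    """Illustrative sketch of a distance-transform-style line loss.

    pred, target: 2-D arrays in [0, 1], where dark pixels (< threshold)
    are line strokes. For every predicted line pixel we find the distance
    to the nearest target line pixel and pass it through a nonlinear
    (here quadratic) mapping, so tiny misalignments cost little while
    large discrepancies are penalized heavily. This is a hypothetical
    reconstruction, not the paper's loss.
    """
    ty, tx = np.nonzero(target < threshold)   # target line pixel coords
    py, px = np.nonzero(pred < threshold)     # predicted line pixel coords
    if len(py) == 0 or len(ty) == 0:
        return 0.0
    # Pairwise Euclidean distances (brute force; a real implementation
    # would use a precomputed distance transform of the target image).
    d = np.sqrt((py[:, None] - ty[None, :]) ** 2 +
                (px[:, None] - tx[None, :]) ** 2)
    nearest = d.min(axis=1)                   # distance to closest target stroke
    return float((nearest ** alpha).mean())   # nonlinear (quadratic) mapping
```

With `alpha=2.0`, a predicted stroke shifted by one pixel contributes a penalty of 1 per pixel, while a stroke five pixels away contributes 25, which captures the "forgive small, punish large" behavior the abstract describes.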
| Item Type | Article |
|---|---|
| Date Type | Publication |
| Status | Published |
| Schools | Computer Science & Informatics |
| Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
| ISSN | 0162-8828 |
| Funders | The Royal Society |
| Date of First Compliant Deposit | 12 April 2020 |
| Date of Acceptance | 7 April 2020 |
| Last Modified | 07 Nov 2023 02:46 |
| URI | https://orca.cardiff.ac.uk/id/eprint/130961 |
Citation Data
Cited 9 times in Scopus.