Huo, Jing; Jin, Shiyin; Li, Jiashen; Tian, Pingzhuo; Li, Wenbin; Wu, Jing
Abstract
Most collection-based style transfer methods require training a separate model for each collection of styles, making extension to multiple style collections inflexible. Moreover, existing collection-based methods cannot easily extend to new style collections in a continual manner. To address these issues, we propose a novel MultI-Dictionary Generative Adversarial Network framework (MID-GAN) for multi-collection style transfer. Specifically, we design a multi-dictionary architecture within a GAN, where each dictionary consists of a set of local style codes for a specific style collection. Benefiting from the local style codes in each dictionary, we further propose a stylization module with aligned skip connections, which better preserves both local details and the overall image structure. The dictionary design allows flexible extension to new style collections by simply adding new dictionaries, and we propose a continual training strategy that both preserves the transfer ability for old styles and achieves good results for newly added styles. Extensive experiments show that the proposed method outperforms existing collection-based style transfer methods. We also demonstrate that it can generate diverse, meaningful style transfer results within the same style collection.
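The key organizational idea in the abstract — one dictionary of local style codes per collection, with new collections added without disturbing old ones — can be sketched in a few lines. This is a minimal, hypothetical illustration of the bookkeeping only (names such as `MultiDictionary`, `add_collection`, and `sample_code` are invented here, not the paper's API), not the GAN itself:

```python
import numpy as np

class MultiDictionary:
    """Hypothetical sketch: one dictionary of local style codes per
    style collection. Adding a collection adds a new dictionary and
    never modifies existing ones, mirroring the continual-extension
    property described in the abstract."""

    def __init__(self, code_dim=8):
        self.code_dim = code_dim
        # collection name -> array of shape (num_codes, code_dim)
        self.dictionaries = {}

    def add_collection(self, name, num_codes=4, seed=0):
        """Register a new style collection with freshly initialised codes."""
        rng = np.random.default_rng(seed)
        self.dictionaries[name] = rng.standard_normal(
            (num_codes, self.code_dim))

    def sample_code(self, name, seed=None):
        """Draw one local style code from a collection's dictionary;
        sampling different codes is what enables diverse stylizations
        of the same collection."""
        rng = np.random.default_rng(seed)
        codes = self.dictionaries[name]
        return codes[rng.integers(len(codes))]

# Usage: extend to a second collection without touching the first.
bank = MultiDictionary(code_dim=8)
bank.add_collection("monet")
bank.add_collection("ukiyoe", seed=1)  # continual extension
code = bank.sample_code("monet")       # one local style code, shape (8,)
```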
Item Type: | Article |
---|---|
Date Type: | Published Online |
Status: | In Press |
Schools: | Schools > Computer Science & Informatics |
Additional Information: | License information from Publisher: LICENSE 1: URL: https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html, Start Date: 2025-01-01 |
Publisher: | Institute of Electrical and Electronics Engineers |
ISSN: | 1520-9210 |
Last Modified: | 28 Aug 2025 10:00 |
URI: | https://orca.cardiff.ac.uk/id/eprint/180714 |