{"title":"Style Image Retrieval for Improving Material Translation Using Neural Style Transfer","authors":"Gibran Benitez-Garcia, Wataru Shimoda, Keiji Yanai","doi":"10.1145/3379173.3393707","DOIUrl":null,"url":null,"abstract":"In this paper, we propose a CNN-feature-based image retrieval method to find the ideal style image that better translates the material of an object. An ideal style image must share semantic information with the content image, while containing distinctive characteristics of the desired material. Therefore, we first refine the search by selecting the most discriminative images from the target material. Subsequently, our search process focuses on the object semantics by removing the style information using instance normalization whitening. Thus, the search is performed using the normalized CNN features. In order to translate materials to object regions, we combine semantic segmentation with neural style transfer. We segment objects from the content image by using a weakly supervised segmentation method, and transfer the material of the retrieved style image to the segmented areas. We demonstrate quantitatively and qualitatively that by using ideal style images, the results of the conventional neural style transfer are significantly improved, overcoming state-of-the-art approaches, such as WCT, MUNIT, and StarGAN.","PeriodicalId":416027,"journal":{"name":"Proceedings of the 2020 Joint Workshop on Multimedia Artworks Analysis and Attractiveness Computing in Multimedia","volume":"70 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-06-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2020 Joint Workshop on Multimedia Artworks Analysis and Attractiveness Computing in Multimedia","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3379173.3393707","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
In this paper, we propose a CNN-feature-based image retrieval method to find the ideal style image, i.e., the one that best translates the material of an object. An ideal style image must share semantic information with the content image while exhibiting the distinctive characteristics of the desired material. Therefore, we first refine the search by selecting the most discriminative images of the target material. Subsequently, our search focuses on object semantics by removing style information through instance normalization whitening, so that retrieval is performed on the normalized CNN features. To translate materials onto object regions, we combine semantic segmentation with neural style transfer: we segment objects in the content image using a weakly supervised segmentation method and transfer the material of the retrieved style image to the segmented areas. We demonstrate quantitatively and qualitatively that using ideal style images significantly improves the results of conventional neural style transfer, outperforming state-of-the-art approaches such as WCT, MUNIT, and StarGAN.
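The sketch below illustrates the retrieval idea described in the abstract: per-channel instance normalization removes the style statistics (channel-wise mean and standard deviation) from CNN features, so that candidate style images can be ranked by how well their normalized features match the content image's semantics. This is a minimal illustration, not the authors' implementation; the VGG-19 relu4_1 backbone, the fixed 224x224 input size, and cosine similarity as the matching score are all assumptions made here for concreteness.

```python
# Minimal sketch of style-agnostic retrieval via instance-normalization whitening.
# Assumptions (not from the paper): VGG-19 features up to relu4_1, ImageNet-normalized
# 224x224 inputs, and cosine similarity between flattened whitened feature maps.
import torch
import torch.nn.functional as F
import torchvision.models as models

# Feature extractor: VGG-19 layers up to relu4_1 (index 20), frozen.
vgg = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1).features[:21].eval()

def in_whitened_descriptor(image: torch.Tensor) -> torch.Tensor:
    """image: (1, 3, 224, 224), ImageNet-normalized. Returns a style-free descriptor."""
    with torch.no_grad():
        feat = vgg(image)                          # (1, C, h, w)
    mu = feat.mean(dim=(2, 3), keepdim=True)       # per-channel mean (style statistic)
    sigma = feat.std(dim=(2, 3), keepdim=True)     # per-channel std (style statistic)
    whitened = (feat - mu) / (sigma + 1e-5)        # instance-normalization whitening
    return whitened.flatten()                      # spatial layout kept -> object semantics

def retrieve_style(content_img: torch.Tensor, candidate_imgs: list) -> int:
    """Rank candidate style images by semantic similarity to the content image."""
    query = in_whitened_descriptor(content_img)
    scores = [F.cosine_similarity(query, in_whitened_descriptor(c), dim=0)
              for c in candidate_imgs]
    return int(torch.stack(scores).argmax())       # index of the best-matching style image
```

In this sketch the retrieved style image would then be passed, together with the segmented object region of the content image, to an off-the-shelf neural style transfer method, which is the pipeline the abstract outlines.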