PSTNet: Protectable Style Transfer Network Based on Steganography
Yuliang Xue, Nan Zhong, Zhenxing Qian, Xinpeng Zhang
2022 International Conference on Culture-Oriented Science and Technology (CoST), August 2022
DOI: 10.1109/CoST57098.2022.00021
Citations: 1
Abstract
Neural style transfer (NST) is a deep-learning technique that preserves the content of an image while converting its style to a target style. In recent years, NST has been widely used to generate new artworks from existing styles and to promote cultural communication. However, little research has considered the protection of copyright during the generation of stylised images. To this end, we propose an end-to-end protectable style transfer network based on steganography, called PSTNet. The network, comprising an encoder and a decoder, takes a content image and copyright information as input. The encoder embeds the copyright information directly into the content image and renders the image in a specific style. When the copyright needs to be verified, only the corresponding decoder can extract the copyright information correctly. Furthermore, an elaborately designed noise layer is added between the encoder and decoder to improve the robustness of the copyright protection method. Experiments show that the protectable stylised images generated by PSTNet have high visual quality, and steganalysis confirms that the embedded copyright information is undetectable. In addition, our method is robust enough that the copyright of generated stylised images can still be proved even after the images spread on real social networks. We hope this work will raise awareness of the protection of artworks created by NST.
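PSTNet learns its embedding end-to-end with a neural encoder and decoder, and its exact architecture is not reproduced here. Purely as a loose illustration of the embed/extract contract that any steganographic copyright scheme must satisfy (the hidden payload must survive in the output image and be recoverable only by the matching extractor), the following hypothetical sketch hides copyright bits in pixel least-significant bits; LSB substitution is a classical baseline, not the paper's method:

```python
def embed_bits(pixels, bits):
    """Embed one bit per pixel into the least significant bit (LSB)."""
    if len(bits) > len(pixels):
        raise ValueError("message too long for cover image")
    stego = list(pixels)
    for i, b in enumerate(bits):
        # Clear the LSB, then set it to the message bit.
        stego[i] = (stego[i] & ~1) | b
    return stego

def extract_bits(pixels, n):
    """Recover the first n embedded bits from the LSBs."""
    return [p & 1 for p in pixels[:n]]

# Toy cover "image" as a flat list of 8-bit pixel values.
cover = [120, 57, 200, 33, 90, 14, 255, 68]
message = [1, 0, 1, 1, 0, 1]          # copyright payload as bits
stego = embed_bits(cover, message)

assert extract_bits(stego, len(message)) == message
# Each pixel changes by at most 1, so the distortion is visually negligible.
assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
```

Unlike this fixed-rule baseline, a learned scheme such as the one the abstract describes trains the encoder/decoder jointly (with a noise layer in between) so that extraction still succeeds after stylisation and real-world degradations like social-network recompression.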