A restoration method using dual generate adversarial networks for Chinese ancient characters

Benpeng Su, Xuxing Liu, Weize Gao, Ye Yang, Shanxiong Chen

Visual Informatics, Volume 6, Issue 1, March 2022, Pages 26–34. doi: 10.1016/j.visinf.2022.02.001
Ancient books that record the history of different periods are precious to human civilization, but their preservation faces serious problems such as aging. It is therefore important to repair the damaged characters in ancient books and restore their original textures. Restoring a damaged character requires keeping its stroke shape correct and its font style consistent. To address these problems, this paper proposes a new restoration method based on generative adversarial networks. A shape restoration network recovers the stroke shape and the font style, while a texture repair network reconstructs texture details. To improve the accuracy of the generator in the shape restoration network, we replace the traditional perceptual loss with an adversarial feature loss (AFL), which updates the generator and discriminator synchronously. In addition, a font style loss is proposed to maintain stylistic consistency across the whole character. Our model is evaluated on the Yi and Qing datasets and outperforms current state-of-the-art techniques both quantitatively and qualitatively. In particular, the Structural Similarity (SSIM) improves by 8.0% and 6.7% on the two datasets, respectively.
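To make the two-stage design described in the abstract concrete, the sketch below shows one plausible way such a pipeline could be wired up in PyTorch: a shape-restoration generator trained against a discriminator whose intermediate features drive an adversarial feature loss, plus a Gram-matrix style term standing in for the font style loss. The module names, layer sizes, and the choice of a Gram-matrix formulation are assumptions for illustration only; the paper's actual architectures, texture repair network, and loss weights are not reproduced here.

```python
# Illustrative sketch (assumed details, not the paper's implementation) of a
# shape-restoration GAN stage with an adversarial feature loss (AFL) and a
# font style loss. The texture repair stage is omitted for brevity.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class ShapeGenerator(nn.Module):
    """Hypothetical generator that recovers stroke shape from a damaged glyph."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(conv_block(1, 32), conv_block(32, 32),
                                 nn.Conv2d(32, 1, kernel_size=3, padding=1),
                                 nn.Sigmoid())

    def forward(self, damaged):
        return self.net(damaged)


class Discriminator(nn.Module):
    """Hypothetical discriminator whose intermediate features are reused for AFL."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(conv_block(1, 32), nn.MaxPool2d(2),
                                      conv_block(32, 64), nn.MaxPool2d(2))
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(64, 1))

    def forward(self, x):
        feats = self.features(x)
        return self.head(feats), feats


def adversarial_feature_loss(disc, restored, target):
    # AFL as sketched here: match discriminator features of restored vs. real
    # glyphs, so the feature extractor is trained jointly with the generator
    # (unlike a frozen perceptual network).
    _, feat_fake = disc(restored)
    _, feat_real = disc(target)
    return F.l1_loss(feat_fake, feat_real.detach())


def gram(x):
    b, c, h, w = x.shape
    f = x.view(b, c, h * w)
    return torch.bmm(f, f.transpose(1, 2)) / (c * h * w)


def font_style_loss(disc, restored, target):
    # One plausible realisation of a style loss: Gram matrices of
    # discriminator features, encouraging stylistic consistency.
    _, feat_fake = disc(restored)
    _, feat_real = disc(target)
    return F.l1_loss(gram(feat_fake), gram(feat_real.detach()))


if __name__ == "__main__":
    gen, disc = ShapeGenerator(), Discriminator()
    damaged = torch.rand(4, 1, 64, 64)   # toy batch of damaged glyph images
    target = torch.rand(4, 1, 64, 64)    # corresponding intact glyphs
    restored = gen(damaged)
    loss = (F.l1_loss(restored, target)
            + adversarial_feature_loss(disc, restored, target)
            + font_style_loss(disc, restored, target))
    loss.backward()
    print("total generator loss:", loss.item())
```

In this toy setup the generator's loss combines a pixel reconstruction term with the AFL and style terms; in a full training loop the discriminator would additionally be updated with a standard adversarial objective, which is what allows its features to evolve synchronously with the generator rather than remaining fixed as in a conventional perceptual loss.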