Unpaired Sketch-to-Line Translation via Synthesis of Sketches

Gayoung Lee, Dohyun Kim, Y. Yoo, Dongyoon Han, Jung-Woo Ha, Jaehyuk Chang
{"title":"通过草图合成的非配对草图到线条的转换","authors":"Gayoung Lee, Dohyun Kim, Y. Yoo, Dongyoon Han, Jung-Woo Ha, Jaehyuk Chang","doi":"10.1145/3355088.3365163","DOIUrl":null,"url":null,"abstract":"Converting hand-drawn sketches into clean line drawings is a crucial step for diverse artistic works such as comics and product designs. Recent data-driven methods using deep learning have shown their great abilities to automatically simplify sketches on raster images. Since it is difficult to collect or generate paired sketch and line images, lack of training data is a main obstacle to use these models. In this paper, we propose a training scheme that requires only unpaired sketch and line images for learning sketch-to-line translation. To do this, we first generate realistic paired sketch and line images from unpaired sketch and line images using rule-based line augmentation and unsupervised texture conversion. Next, with our synthetic paired data, we train a model for sketch-to-line translation using supervised learning. Compared to unsupervised methods that use cycle consistency losses, our model shows better performance at removing noisy strokes. We also show that our model simplifies complicated sketches better than models trained on a limited number of handcrafted paired data.","PeriodicalId":435930,"journal":{"name":"SIGGRAPH Asia 2019 Technical Briefs","volume":"63 7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Unpaired Sketch-to-Line Translation via Synthesis of Sketches\",\"authors\":\"Gayoung Lee, Dohyun Kim, Y. Yoo, Dongyoon Han, Jung-Woo Ha, Jaehyuk Chang\",\"doi\":\"10.1145/3355088.3365163\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Converting hand-drawn sketches into clean line drawings is a crucial step for diverse artistic works such as comics and product designs. Recent data-driven methods using deep learning have shown their great abilities to automatically simplify sketches on raster images. Since it is difficult to collect or generate paired sketch and line images, lack of training data is a main obstacle to use these models. In this paper, we propose a training scheme that requires only unpaired sketch and line images for learning sketch-to-line translation. To do this, we first generate realistic paired sketch and line images from unpaired sketch and line images using rule-based line augmentation and unsupervised texture conversion. Next, with our synthetic paired data, we train a model for sketch-to-line translation using supervised learning. Compared to unsupervised methods that use cycle consistency losses, our model shows better performance at removing noisy strokes. 
We also show that our model simplifies complicated sketches better than models trained on a limited number of handcrafted paired data.\",\"PeriodicalId\":435930,\"journal\":{\"name\":\"SIGGRAPH Asia 2019 Technical Briefs\",\"volume\":\"63 7 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-11-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"SIGGRAPH Asia 2019 Technical Briefs\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3355088.3365163\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIGGRAPH Asia 2019 Technical Briefs","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3355088.3365163","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Converting hand-drawn sketches into clean line drawings is a crucial step for diverse artistic works such as comics and product designs. Recent data-driven methods using deep learning have shown their great abilities to automatically simplify sketches on raster images. Since it is difficult to collect or generate paired sketch and line images, lack of training data is a main obstacle to use these models. In this paper, we propose a training scheme that requires only unpaired sketch and line images for learning sketch-to-line translation. To do this, we first generate realistic paired sketch and line images from unpaired sketch and line images using rule-based line augmentation and unsupervised texture conversion. Next, with our synthetic paired data, we train a model for sketch-to-line translation using supervised learning. Compared to unsupervised methods that use cycle consistency losses, our model shows better performance at removing noisy strokes. We also show that our model simplifies complicated sketches better than models trained on a limited number of handcrafted paired data.
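To make the data-synthesis idea concrete, the sketch below illustrates one possible form of rule-based line augmentation: starting from a clean line drawing, overlay several slightly jittered copies of its strokes so the result resembles a rough sketch, yielding a synthetic (sketch, line) pair. The specific rules used here (small random translations and rotations, compositing by keeping the darkest pixel) and the function name `synthesize_sketch` are illustrative assumptions, not the paper's actual augmentation rules, which also include an unsupervised texture-conversion step not shown here.

```python
# Minimal illustrative sketch of rule-based line augmentation (assumed rules,
# not the paper's exact procedure). Requires numpy and OpenCV (cv2).
import numpy as np
import cv2


def synthesize_sketch(line_img: np.ndarray,
                      num_copies: int = 3,
                      max_shift: int = 3,
                      max_angle: float = 2.0,
                      seed: int = 0) -> np.ndarray:
    """line_img: grayscale uint8 image with dark strokes on a white background."""
    rng = np.random.default_rng(seed)
    h, w = line_img.shape
    out = np.full_like(line_img, 255)  # start from a blank white canvas
    for _ in range(num_copies):
        dx, dy = rng.integers(-max_shift, max_shift + 1, size=2)
        angle = rng.uniform(-max_angle, max_angle)
        # Small random rotation about the image center plus a small translation,
        # mimicking the redundant, slightly misaligned strokes of a rough sketch.
        m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), float(angle), 1.0)
        m[:, 2] += (dx, dy)
        warped = cv2.warpAffine(line_img, m, (w, h),
                                borderMode=cv2.BORDER_CONSTANT, borderValue=255)
        out = np.minimum(out, warped)  # keep the darkest stroke at each pixel
    return out
```

Each clean line drawing paired with its synthesized sketch can then serve as a supervised training example for an ordinary image-to-image translation model trained with a reconstruction loss, which is the second stage the abstract describes.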