Flow style-aware network for arbitrary style transfer

IF 2.5 | JCR Q2 (Computer Science, Software Engineering) | CAS Tier 4 (Computer Science)
Computers & Graphics, Volume 124, Article 104098
Pub Date: 2024-09-29 | DOI: 10.1016/j.cag.2024.104098
URL: https://www.sciencedirect.com/science/article/pii/S0097849324002334
Zhenshan Hu, Bin Ge, Chenxing Xia, Wenyan Wu, Guangao Zhou, Baotong Wang
Citations: 0

Abstract

Researchers have recently proposed arbitrary style transfer methods based on various model frameworks. Although these methods achieve good results, they still suffer from insufficient stylization, artifacts, and inadequate retention of content structure. To address these problems, we propose a flow style-aware network (FSANet) for arbitrary style transfer, which combines a VGG network with a flow network. FSANet consists of a flow style transfer module (FSTM), a dynamic regulation attention module (DRAM), and a style feature interaction module (SFIM). The flow style transfer module uses the reversible residual block features of the flow network to create a sample feature containing the target content and style. To adapt the FSTM to VGG networks, we design the dynamic regulation attention module and exploit the sample features at both the channel and pixel levels. The style feature interaction module computes a style tensor that optimizes the fused features. Extensive qualitative and quantitative experiments demonstrate that our proposed FSANet can effectively avoid artifacts and enhance the preservation of content details while migrating style features.
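The abstract notes that the FSTM builds on the reversible residual-block features of a flow network. The paper's own architecture is not reproduced here, but a minimal sketch of why flow networks are attractive for this purpose is the additive coupling block common in flow models: the transform is exactly invertible, so no feature information is discarded. The function names and the toy branch `f` below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def coupling_forward(x, weight):
    """Additive coupling: split channels into [x1, x2], map to [x1, x2 + f(x1)].
    f is a small nonlinear branch (a stand-in for a conv sub-network)."""
    x1, x2 = np.split(x, 2, axis=-1)
    y2 = x2 + np.tanh(x1 @ weight)
    return np.concatenate([x1, y2], axis=-1)

def coupling_inverse(y, weight):
    """Exact inverse: x1 passes through unchanged, so f(x1) can be
    recomputed and subtracted to recover x2 with no information loss."""
    y1, y2 = np.split(y, 2, axis=-1)
    x2 = y2 - np.tanh(y1 @ weight)
    return np.concatenate([y1, x2], axis=-1)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))          # 4 feature vectors, 8 channels
w = rng.standard_normal((4, 4)) * 0.1    # weights of the toy branch f

y = coupling_forward(x, w)
x_rec = coupling_inverse(y, w)
print(np.allclose(x, x_rec))
```

Exact invertibility is the property such blocks contribute: features pushed through the flow can be recovered bit-for-bit, which is why they are well suited to preserving content structure during stylization.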

Source journal: Computers & Graphics (Engineering & Technology – Computer Science, Software Engineering)
CiteScore: 5.30
Self-citation rate: 12.00%
Articles published: 173
Review time: 38 days
About the journal: Computers & Graphics is dedicated to disseminating information on research and applications of computer graphics (CG) techniques. The journal encourages articles on:
1. Research and applications of interactive computer graphics. We are particularly interested in novel interaction techniques and applications of CG to problem domains.
2. State-of-the-art papers on late-breaking, cutting-edge research on CG.
3. Information on innovative uses of graphics principles and technologies.
4. Tutorial papers on both teaching CG principles and innovative uses of CG in education.
Latest articles from this journal:
Enhancing Visual Analytics systems with guidance: A task-driven methodology
Learning geometric complexes for 3D shape classification
RenalViz: Visual analysis of cohorts with chronic kidney disease
Enhancing semantic mapping in text-to-image diffusion via Gather-and-Bind
CGLight: An effective indoor illumination estimation method based on improved convmixer and GauGAN