Effective cross-sensor color constancy using a dual-mapping strategy

Shuwei Yue and Minchen Wei
{"title":"Effective cross-sensor color constancy using a dual-mapping strategy","authors":"Shuwei Yue and Minchen Wei","doi":"10.1364/josaa.505814","DOIUrl":null,"url":null,"abstract":"Deep neural networks (DNNs) have been widely used for illuminant estimation, which commonly requires great efforts to collect sensor-specific data. In this paper, we propose a dual-mapping strategy—the DMCC method. It only requires the white points captured by the training and testing sensors under a D65 condition to reconstruct the image and illuminant data, and then maps the reconstructed image into sparse features. These features, together with the reconstructed illuminants, were used to train a lightweight multi-layer perceptron (MLP) model, which can be directly used to estimate the illuminant for the testing sensor. The proposed model was found to have performance comparable to other state-of-the-art methods, based on the three available datasets. Moreover, the smaller number of parameters, faster speed, and not requiring data collection using the testing sensor make it ready for practical deployment. This paper is an extension of Yue and Wei [<i>Color and Imaging Conference</i> (2023)], with more detailed results, analyses, and discussions.","PeriodicalId":501620,"journal":{"name":"Journal of the Optical Society of America A","volume":"29 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of the Optical Society of America A","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1364/josaa.505814","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Deep neural networks (DNNs) have been widely used for illuminant estimation, but this commonly requires great effort to collect sensor-specific data. In this paper, we propose a dual-mapping strategy, the DMCC method. It requires only the white points captured by the training and testing sensors under a D65 condition to reconstruct the image and illuminant data, and it then maps the reconstructed images into sparse features. These features, together with the reconstructed illuminants, are used to train a lightweight multi-layer perceptron (MLP) model, which can be directly used to estimate the illuminant for the testing sensor. The proposed model was found to perform comparably to other state-of-the-art methods on three available datasets. Moreover, its smaller number of parameters, faster speed, and freedom from data collection with the testing sensor make it ready for practical deployment. This paper is an extension of Yue and Wei [Color and Imaging Conference (2023)], with more detailed results, analyses, and discussions.
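The abstract describes the pipeline only at a high level. As a rough illustration, the sketch below shows one way a dual-mapping pipeline of this kind could be assembled: (1) a diagonal, von Kries style transform built from the two sensors' D65 white points maps data from the training sensor's space into the testing sensor's space, and (2) each mapped image is compressed into a sparse feature fed to a lightweight MLP illuminant estimator. This is an assumption-laden reconstruction, not the authors' published implementation: the diagonal mapping, the log-chromaticity histogram feature, and all names and layer widths (`diagonal_map`, `sparse_chroma_histogram`, `in_dim=256`, `hidden=64`) are illustrative choices.

```python
# Minimal sketch of a dual-mapping cross-sensor pipeline (illustrative only;
# not the DMCC implementation from the paper).

import numpy as np
import torch
import torch.nn as nn

def diagonal_map(rgb, white_train, white_test):
    """Per-channel diagonal (von Kries style) mapping from the training
    sensor's space to the testing sensor's space.

    rgb:      (H, W, 3) linear image captured by the training sensor
    white_*:  (3,) RGB responses of each sensor to a D65 illuminant
    """
    gains = np.asarray(white_test, dtype=float) / np.asarray(white_train, dtype=float)
    return rgb * gains  # gains broadcast over all pixels

def sparse_chroma_histogram(rgb, bins=16, eps=1e-6):
    """Flattened 2D log-chromaticity histogram as a sparse image feature."""
    r, g, b = rgb[..., 0].ravel(), rgb[..., 1].ravel(), rgb[..., 2].ravel()
    u = np.log(r + eps) - np.log(g + eps)
    v = np.log(b + eps) - np.log(g + eps)
    hist, _, _ = np.histogram2d(u, v, bins=bins, range=[[-3, 3], [-3, 3]])
    hist = hist.ravel()                # (bins * bins,), mostly zeros
    return hist / (hist.sum() + eps)   # normalize to a distribution

class IlluminantMLP(nn.Module):
    """Lightweight MLP mapping a sparse feature to an RGB illuminant estimate."""
    def __init__(self, in_dim=256, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )
    def forward(self, x):
        out = self.net(x)
        return out / out.norm(dim=-1, keepdim=True)  # unit-norm illuminant

# Example on synthetic data: map a training-sensor image into the testing
# sensor's space, featurize it, and run the (untrained) estimator.
img = np.random.rand(64, 64, 3)
mapped = diagonal_map(img, white_train=[0.9, 1.0, 0.8], white_test=[1.0, 1.0, 1.0])
feat = torch.tensor(sparse_chroma_histogram(mapped), dtype=torch.float32)
estimate = IlluminantMLP()(feat.unsqueeze(0))  # shape (1, 3)
```

The diagonal white-point transform is the classical chromatic-adaptation approximation; a learned or full 3x3 sensor-to-sensor mapping could be substituted without changing the rest of the sketch.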