Twin neural network regression

Applied AI Letters, Pub Date: 2022-10-04, DOI: 10.1002/ail2.78
Sebastian Johann Wetzel, Kevin Ryczko, Roger Gordon Melko, Isaac Tamblyn
Open access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1002/ail2.78
Citations: 4

Abstract

We introduce twin neural network regression (TNNR). This method predicts differences between the target values of two different data points rather than the targets themselves. The solution of a traditional regression problem is then obtained by averaging over an ensemble of all predicted differences between the target of an unseen data point and the targets of all training data points. Whereas ensembles are normally costly to produce, TNNR intrinsically creates an ensemble of predictions of twice the size of the training set while training only a single neural network. Since ensembles have been shown to be more accurate than single models, this property naturally transfers to TNNR. We show that TNNs are able to compete with, or yield more accurate predictions than, other state-of-the-art methods on different data sets. Furthermore, TNNR is constrained by self-consistency conditions. We find that the violation of these conditions provides a signal for the prediction uncertainty.
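The ensemble construction described above can be sketched in a few lines. The following is a minimal illustration only, not the paper's implementation: it assumes a generic MLP on concatenated feature pairs in place of the authors' architecture, and uses a toy 1-D data set. The twin network is trained to predict the difference y_i − y_j for every ordered pair of training points; a prediction for a new point x is then the average of the 2m difference-based estimates f(x, x_j) + y_j and y_j − f(x_j, x) over all m training anchors.

```python
# Hedged sketch of twin neural network regression (TNNR).
# Assumptions (not from the paper): an sklearn MLPRegressor on concatenated
# input pairs stands in for the twin architecture; toy data y = sin(x).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
m = 80

# Toy training set.
X_train = rng.uniform(-3, 3, size=(m, 1))
y_train = np.sin(X_train[:, 0]) + 0.05 * rng.normal(size=m)

# All ordered training pairs; the network learns target *differences*.
i, j = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
pairs = np.hstack([X_train[i.ravel()], X_train[j.ravel()]])
diffs = y_train[i.ravel()] - y_train[j.ravel()]

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(pairs, diffs)

def predict(x_new):
    """Average the intrinsic ensemble of 2m difference-based estimates."""
    x = np.tile(np.atleast_2d(x_new), (m, 1))
    fwd = net.predict(np.hstack([x, X_train])) + y_train   # f(x, x_j) + y_j
    bwd = y_train - net.predict(np.hstack([X_train, x]))   # y_j - f(x_j, x)
    return np.concatenate([fwd, bwd]).mean()

print(predict([1.0]))  # ensemble estimate of sin(1.0) on this toy problem
```

The spread of the 2m individual estimates (rather than just their mean) is one natural proxy for the self-consistency-based uncertainty signal the abstract mentions.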

