Use of the URREF towards Information Fusion Accountability Evaluation

E. Blasch, J. Pieter de Villiers, Gregor Pavlin, A. Jousselme, P. Costa, Kathryn B. Laskey, J. Ziegler
{"title":"Use of the URREF towards Information Fusion Accountability Evaluation","authors":"E. Blasch, J. Pieter de Villiers, Gregor Pavin, A. Jousselme, P. Costa, Kathryn B. Laskey, J. Ziegler","doi":"10.23919/fusion49465.2021.9626847","DOIUrl":null,"url":null,"abstract":"The EXCITE (EXplainability Capability Information Testing Evaluation) approach assesses information fusion interpretability, explainability, and accountability for uncertainty analysis. Amongst many data and information fusion techniques is the need to understand the information fusion system capability for the intended application. While many approaches for data fusion support uncertainty reduction from measured data, there are other contributing factors such as data source credibility, knowledge completeness, multiresolution, and problem alignment. To facilitate the alignment of the data fusion approach to the user’s intended action, there is a need towards a representation of the uncertainty. The paper highlights the approach to leverage recent research efforts in interpretability as methods of data handing in the Uncertainty Representation and Reasoning Evaluation Framework (URREF) while also proposing explainability and accountability as a representation criterion. Accountability is closely associated with the selected decision and the outcome which has these four attributes: amount of data towards the result, distance score of decision selection, accuracy/credibility/timeliness of results, and risk analysis. The risk analysis includes: verifiability, observability, liability, transparency, attributability, and remediabilty. Results are demonstrated on notional example.","PeriodicalId":226850,"journal":{"name":"2021 IEEE 24th International Conference on Information Fusion (FUSION)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2021-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 24th International Conference on Information Fusion (FUSION)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/fusion49465.2021.9626847","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

The EXCITE (EXplainability Capability Information Testing Evaluation) approach assesses information fusion interpretability, explainability, and accountability for uncertainty analysis. Among the many data and information fusion techniques, there is a need to understand the information fusion system's capability for the intended application. While many data fusion approaches support uncertainty reduction from measured data, other factors also contribute, such as data source credibility, knowledge completeness, multiresolution, and problem alignment. To align the data fusion approach with the user's intended action, a representation of the uncertainty is needed. The paper highlights an approach that leverages recent research efforts in interpretability as methods of data handling within the Uncertainty Representation and Reasoning Evaluation Framework (URREF), while also proposing explainability and accountability as representation criteria. Accountability is closely associated with the selected decision and its outcome, and has four attributes: the amount of data contributing to the result, the distance score of the decision selection, the accuracy/credibility/timeliness of the results, and risk analysis. The risk analysis includes verifiability, observability, liability, transparency, attributability, and remediability. Results are demonstrated on a notional example.
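
The accountability attributes and risk-analysis facets listed in the abstract can be read as a small evaluation data structure. The following minimal Python sketch shows one way to organize them for a single fusion decision; the class and field names (RiskAnalysis, AccountabilityAssessment, data_amount, decision_distance, result_quality) are illustrative assumptions for this page and are not drawn from the URREF ontology or the paper itself.

    # Illustrative sketch only: names and scoring scales are assumptions,
    # not definitions from the URREF ontology or the EXCITE paper.
    from dataclasses import dataclass, field


    @dataclass
    class RiskAnalysis:
        """Six risk-analysis facets of accountability listed in the abstract."""
        verifiability: float = 0.0
        observability: float = 0.0
        liability: float = 0.0
        transparency: float = 0.0
        attributability: float = 0.0
        remediability: float = 0.0


    @dataclass
    class AccountabilityAssessment:
        """Four accountability attributes tied to a selected decision and its outcome."""
        data_amount: int = 0            # amount of data contributing to the result
        decision_distance: float = 0.0  # distance score of the decision selection
        result_quality: dict = field(default_factory=lambda: {
            "accuracy": 0.0, "credibility": 0.0, "timeliness": 0.0})
        risk: RiskAnalysis = field(default_factory=RiskAnalysis)


    # Example: a notional assessment of one fusion decision.
    assessment = AccountabilityAssessment(
        data_amount=1200,
        decision_distance=0.35,
        result_quality={"accuracy": 0.9, "credibility": 0.8, "timeliness": 0.7},
        risk=RiskAnalysis(verifiability=0.8, transparency=0.6, remediability=0.5),
    )
    print(assessment)

Grouping the six risk facets in their own structure mirrors the abstract's treatment of risk analysis as a single accountability attribute with multiple sub-criteria.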