Xia Lei, Jia-Jiang Lin, Xiong-Lin Luo, Yongkai Fan
{"title":"用辛伴随法解释深度残差网络的预测","authors":"Xia Lei, Jia-Jiang Lin, Xiong-Lin Luo, Yongkai Fan","doi":"10.2298/csis230310047l","DOIUrl":null,"url":null,"abstract":"Understanding deep residual networks (ResNets) decisions are receiving much attention as a way to ensure their security and reliability. Recent research, however, lacks theoretical analysis to guarantee the faithfulness of explanations and could produce an unreliable explanation. In order to explain ResNets predictions, we suggest a provably faithful explanation for ResNet using a surrogate explainable model, a neural ordinary differential equation network (Neural ODE). First, ResNets are proved to converge to a Neural ODE and the Neural ODE is regarded as a surrogate model to explain the decision-making attribution of the ResNets. And then the decision feature and the explanation map of inputs belonging to the target class for Neural ODE are generated via the symplectic adjoint method. Finally, we prove that the explanations of Neural ODE can be sufficiently approximate to ResNet. Experiments show that the proposed explanation method has higher faithfulness with lower computational cost than other explanation approaches and it is effective for troubleshooting and optimizing a model by the explanation.","PeriodicalId":50636,"journal":{"name":"Computer Science and Information Systems","volume":"1 1","pages":"0"},"PeriodicalIF":1.2000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Explaining deep residual networks predictions with symplectic adjoint method\",\"authors\":\"Xia Lei, Jia-Jiang Lin, Xiong-Lin Luo, Yongkai Fan\",\"doi\":\"10.2298/csis230310047l\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Understanding deep residual networks (ResNets) decisions are receiving much attention as a way to ensure their security and reliability. Recent research, however, lacks theoretical analysis to guarantee the faithfulness of explanations and could produce an unreliable explanation. In order to explain ResNets predictions, we suggest a provably faithful explanation for ResNet using a surrogate explainable model, a neural ordinary differential equation network (Neural ODE). First, ResNets are proved to converge to a Neural ODE and the Neural ODE is regarded as a surrogate model to explain the decision-making attribution of the ResNets. And then the decision feature and the explanation map of inputs belonging to the target class for Neural ODE are generated via the symplectic adjoint method. Finally, we prove that the explanations of Neural ODE can be sufficiently approximate to ResNet. 
Experiments show that the proposed explanation method has higher faithfulness with lower computational cost than other explanation approaches and it is effective for troubleshooting and optimizing a model by the explanation.\",\"PeriodicalId\":50636,\"journal\":{\"name\":\"Computer Science and Information Systems\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2023-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computer Science and Information Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2298/csis230310047l\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Science and Information Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2298/csis230310047l","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Explaining deep residual networks predictions with symplectic adjoint method
Understanding the decisions of deep residual networks (ResNets) is receiving much attention as a way to ensure their security and reliability. Recent research, however, lacks the theoretical analysis needed to guarantee the faithfulness of explanations and can therefore produce unreliable explanations. To explain ResNet predictions, we propose a provably faithful explanation for ResNets using a surrogate explainable model, a neural ordinary differential equation network (Neural ODE). First, ResNets are proved to converge to a Neural ODE, and the Neural ODE is regarded as a surrogate model that explains the decision-making attribution of the ResNet. Then, the decision feature and the explanation map of inputs belonging to the target class are generated for the Neural ODE via the symplectic adjoint method. Finally, we prove that the explanations of the Neural ODE sufficiently approximate those of the ResNet. Experiments show that the proposed explanation method achieves higher faithfulness at lower computational cost than other explanation approaches, and that it is effective for troubleshooting and optimizing a model through its explanations.
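To make the pipeline described in the abstract concrete, the sketch below shows the general idea of attributing a target-class score to the input through a Neural ODE surrogate. It is a minimal illustration, not the authors' implementation: it uses torchdiffeq's standard adjoint-based solver (`odeint_adjoint`) as a stand-in for the symplectic adjoint method of the paper, and the names `ODEFunc`, `NeuralODEClassifier`, and `explanation_map` are hypothetical.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint_adjoint as odeint  # standard adjoint, used here in place of the symplectic adjoint


class ODEFunc(nn.Module):
    """Vector field f(h, t) playing the role of a ResNet residual block in the continuous limit."""

    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))

    def forward(self, t, h):
        return self.net(h)


class NeuralODEClassifier(nn.Module):
    """Surrogate model: integrate the ODE from t=0 to t=1, then classify the final state."""

    def __init__(self, dim, n_classes):
        super().__init__()
        self.func = ODEFunc(dim)
        self.head = nn.Linear(dim, n_classes)

    def forward(self, x):
        t = torch.tensor([0.0, 1.0])
        h = odeint(self.func, x, t, method="dopri5")[-1]  # hidden state at t=1
        return self.head(h)


def explanation_map(model, x, target_class):
    """Attribute the target-class score to the input features via gradients through the ODE solve."""
    x = x.clone().requires_grad_(True)
    score = model(x)[:, target_class].sum()
    score.backward()  # gradients flow backward through the adjoint ODE solve
    return x.grad     # one attribution value per input feature


# Usage (toy dimensions): explain class 3 for a batch of 8 feature vectors.
model = NeuralODEClassifier(dim=16, n_classes=10)
attributions = explanation_map(model, torch.randn(8, 16), target_class=3)
```

In the paper, the ResNet itself is first shown to converge to such a Neural ODE, and the backward pass is carried out with the symplectic adjoint method rather than the standard adjoint, which is what gives the claimed faithfulness and memory advantages.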
Aims and scope
Computer Science and Information Systems (ComSIS) is an international refereed journal, published in Serbia. The objective of ComSIS is to communicate important research and development results in the areas of computer science, software engineering, and information systems.