Formalizing Piecewise Affine Activation Functions of Neural Networks in Coq

A. Aleksandrov, Kim Völlinger
{"title":"Coq中神经网络分段仿射激活函数的形式化","authors":"A. Aleksandrov, Kim Völlinger","doi":"10.48550/arXiv.2301.12893","DOIUrl":null,"url":null,"abstract":"Verification of neural networks relies on activation functions being piecewise affine (pwa) -- enabling an encoding of the verification problem for theorem provers. In this paper, we present the first formalization of pwa activation functions for an interactive theorem prover tailored to verifying neural networks within Coq using the library Coquelicot for real analysis. As a proof-of-concept, we construct the popular pwa activation function ReLU. We integrate our formalization into a Coq model of neural networks, and devise a verified transformation from a neural network N to a pwa function representing N by composing pwa functions that we construct for each layer. This representation enables encodings for proof automation, e.g. Coq's tactic lra -- a decision procedure for linear real arithmetic. Further, our formalization paves the way for integrating Coq in frameworks of neural network verification as a fallback prover when automated proving fails.","PeriodicalId":436677,"journal":{"name":"NASA Formal Methods","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Formalizing Piecewise Affine Activation Functions of Neural Networks in Coq\",\"authors\":\"A. Aleksandrov, Kim Völlinger\",\"doi\":\"10.48550/arXiv.2301.12893\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Verification of neural networks relies on activation functions being piecewise affine (pwa) -- enabling an encoding of the verification problem for theorem provers. In this paper, we present the first formalization of pwa activation functions for an interactive theorem prover tailored to verifying neural networks within Coq using the library Coquelicot for real analysis. As a proof-of-concept, we construct the popular pwa activation function ReLU. We integrate our formalization into a Coq model of neural networks, and devise a verified transformation from a neural network N to a pwa function representing N by composing pwa functions that we construct for each layer. This representation enables encodings for proof automation, e.g. Coq's tactic lra -- a decision procedure for linear real arithmetic. 
Further, our formalization paves the way for integrating Coq in frameworks of neural network verification as a fallback prover when automated proving fails.\",\"PeriodicalId\":436677,\"journal\":{\"name\":\"NASA Formal Methods\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-01-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"NASA Formal Methods\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.48550/arXiv.2301.12893\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"NASA Formal Methods","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.48550/arXiv.2301.12893","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Verification of neural networks relies on activation functions being piecewise affine (pwa) -- enabling an encoding of the verification problem for theorem provers. In this paper, we present the first formalization of pwa activation functions for an interactive theorem prover tailored to verifying neural networks within Coq using the library Coquelicot for real analysis. As a proof-of-concept, we construct the popular pwa activation function ReLU. We integrate our formalization into a Coq model of neural networks, and devise a verified transformation from a neural network N to a pwa function representing N by composing pwa functions that we construct for each layer. This representation enables encodings for proof automation, e.g. Coq's tactic lra -- a decision procedure for linear real arithmetic. Further, our formalization paves the way for integrating Coq in frameworks of neural network verification as a fallback prover when automated proving fails.
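
Two of the abstract's steps -- constructing ReLU as a pwa function and discharging the resulting linear goals with Coq's lra tactic -- can be illustrated with a minimal, self-contained Coq sketch. The sketch below is only a stand-in for the paper's Coquelicot-based pwa construction: the names relu and relu_nonneg are hypothetical, and the direct case split on Rle_dec replaces whatever polyhedral representation the formalization actually uses.

Require Import Reals Lra.
Open Scope R_scope.

(* Hypothetical stand-in for the paper's pwa construction of ReLU:
   a direct case split on the sign of the input. *)
Definition relu (x : R) : R :=
  if Rle_dec 0 x then x else 0.

(* Each branch of the case split leaves a linear real-arithmetic goal,
   which lra -- the decision procedure mentioned in the abstract -- closes. *)
Lemma relu_nonneg : forall x : R, 0 <= relu x.
Proof.
  intro x; unfold relu; destruct (Rle_dec 0 x); lra.
Qed.

In the same spirit, once a whole network is represented as a single pwa function, per-region verification goals reduce to linear constraints over the inputs, which is what makes lra-style automation applicable.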