Weighted automata extraction and explanation of recurrent neural networks for natural language tasks

IF 0.7 · CAS Tier 4 (Mathematics) · Q3 (COMPUTER SCIENCE, THEORY & METHODS) · Journal of Logical and Algebraic Methods in Programming · Pub Date: 2023-09-06 · DOI: 10.1016/j.jlamp.2023.100907
Zeming Wei, Xiyue Zhang, Yihao Zhang, Meng Sun
{"title":"自然语言任务中递归神经网络的加权自动机提取与解释","authors":"Zeming Wei ,&nbsp;Xiyue Zhang ,&nbsp;Yihao Zhang ,&nbsp;Meng Sun","doi":"10.1016/j.jlamp.2023.100907","DOIUrl":null,"url":null,"abstract":"<div><p><span>Recurrent Neural Networks (RNNs) have achieved tremendous success in processing sequential data, yet understanding and analyzing their behaviours remains a significant challenge. To this end, many efforts have been made to extract </span>finite automata<span><span><span> from RNNs, which are more amenable for analysis and explanation. However, existing approaches like exact learning and compositional approaches for model extraction have limitations in either scalability or precision. In this paper, we propose a novel framework of Weighted Finite Automata (WFA) extraction and explanation to tackle the limitations for natural language tasks. First, to address the transition sparsity and context loss problems we identified in WFA extraction for natural language tasks, we propose an empirical method to complement missing rules in the </span>transition diagram, and adjust </span>transition matrices<span><span> to enhance the context-awareness of the WFA. We also propose two data augmentation tactics to track more dynamic behaviours of RNN, which further allows us to improve the extraction precision. Based on the extracted model, we propose an explanation method for RNNs including a </span>word embedding<span> method – Transition Matrix Embeddings (TME) and TME-based task oriented explanation for the target RNN. Our evaluation demonstrates the advantage of our method in extraction precision than existing approaches, and the effectiveness of TME-based explanation method in applications to pretraining and adversarial example generation.</span></span></span></p></div>","PeriodicalId":48797,"journal":{"name":"Journal of Logical and Algebraic Methods in Programming","volume":"136 ","pages":"Article 100907"},"PeriodicalIF":0.7000,"publicationDate":"2023-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Weighted automata extraction and explanation of recurrent neural networks for natural language tasks\",\"authors\":\"Zeming Wei ,&nbsp;Xiyue Zhang ,&nbsp;Yihao Zhang ,&nbsp;Meng Sun\",\"doi\":\"10.1016/j.jlamp.2023.100907\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p><span>Recurrent Neural Networks (RNNs) have achieved tremendous success in processing sequential data, yet understanding and analyzing their behaviours remains a significant challenge. To this end, many efforts have been made to extract </span>finite automata<span><span><span> from RNNs, which are more amenable for analysis and explanation. However, existing approaches like exact learning and compositional approaches for model extraction have limitations in either scalability or precision. In this paper, we propose a novel framework of Weighted Finite Automata (WFA) extraction and explanation to tackle the limitations for natural language tasks. First, to address the transition sparsity and context loss problems we identified in WFA extraction for natural language tasks, we propose an empirical method to complement missing rules in the </span>transition diagram, and adjust </span>transition matrices<span><span> to enhance the context-awareness of the WFA. We also propose two data augmentation tactics to track more dynamic behaviours of RNN, which further allows us to improve the extraction precision. 
Based on the extracted model, we propose an explanation method for RNNs including a </span>word embedding<span> method – Transition Matrix Embeddings (TME) and TME-based task oriented explanation for the target RNN. Our evaluation demonstrates the advantage of our method in extraction precision than existing approaches, and the effectiveness of TME-based explanation method in applications to pretraining and adversarial example generation.</span></span></span></p></div>\",\"PeriodicalId\":48797,\"journal\":{\"name\":\"Journal of Logical and Algebraic Methods in Programming\",\"volume\":\"136 \",\"pages\":\"Article 100907\"},\"PeriodicalIF\":0.7000,\"publicationDate\":\"2023-09-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Logical and Algebraic Methods in Programming\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2352220823000615\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, THEORY & METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Logical and Algebraic Methods in Programming","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2352220823000615","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Citations: 0

Abstract

Recurrent Neural Networks (RNNs) have achieved tremendous success in processing sequential data, yet understanding and analyzing their behaviours remains a significant challenge. To this end, many efforts have been made to extract finite automata from RNNs, which are more amenable to analysis and explanation. However, existing model-extraction approaches, such as exact learning and compositional methods, are limited in either scalability or precision. In this paper, we propose a novel framework of Weighted Finite Automata (WFA) extraction and explanation to tackle these limitations for natural language tasks. First, to address the transition sparsity and context loss problems we identified in WFA extraction for natural language tasks, we propose an empirical method to complement missing rules in the transition diagram and adjust the transition matrices to enhance the context-awareness of the WFA. We also propose two data augmentation tactics to track more dynamic behaviours of the RNN, which further improves the extraction precision. Based on the extracted model, we propose an explanation method for RNNs, including a word embedding method, Transition Matrix Embeddings (TME), and a TME-based task-oriented explanation of the target RNN. Our evaluation demonstrates the advantage of our method over existing approaches in extraction precision, and the effectiveness of the TME-based explanation method in applications to pretraining and adversarial example generation.
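For readers unfamiliar with the formalism, the following is a minimal, self-contained sketch of how a weighted finite automaton over words can score a sentence and yield word representations. It is illustrative only: the `SimpleWFA` class, the averaged back-off matrix for words without a learned transition, and the reading of a word's flattened transition matrix as its TME vector are assumptions made for this sketch, not the paper's exact construction.

```python
import numpy as np

class SimpleWFA:
    """A toy weighted finite automaton over words (illustrative, not the paper's method)."""

    def __init__(self, n_states, transition_matrices, initial, final):
        # transition_matrices: dict mapping each word to an (n_states x n_states) matrix
        self.T = transition_matrices
        self.initial = initial  # row vector alpha, shape (n_states,)
        self.final = final      # column vector beta, shape (n_states,)
        # Back-off matrix for unseen words: here simply the average of all learned
        # matrices -- one crude way to "complement missing rules" (an assumption).
        self.backoff = (np.mean(list(self.T.values()), axis=0)
                        if self.T else np.eye(n_states))

    def matrix_for(self, word):
        return self.T.get(word, self.backoff)

    def score(self, sentence):
        # Weight of a sentence w1..wk is alpha @ T(w1) @ ... @ T(wk) @ beta.
        state = self.initial
        for word in sentence:
            state = state @ self.matrix_for(word)
        return float(state @ self.final)

    def embedding(self, word):
        # A TME-style representation: use the word's flattened transition matrix
        # as its vector (an illustrative reading of "Transition Matrix Embeddings").
        return self.matrix_for(word).reshape(-1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vocab = ["the", "movie", "was", "great", "terrible"]
    T = {w: rng.random((3, 3)) for w in vocab}               # stand-in matrices
    wfa = SimpleWFA(3, T,
                    initial=np.array([1.0, 0.0, 0.0]),
                    final=np.array([0.0, 0.0, 1.0]))
    print(wfa.score(["the", "movie", "was", "great"]))       # sentence weight
    print(wfa.embedding("great")[:5])                        # start of its TME vector
```

In typical extraction pipelines the transition matrices are estimated from the RNN's hidden-state behaviour on training data; the average back-off above only gestures at the abstract's idea of complementing sparse transitions, while the paper's method additionally adjusts the matrices to preserve context information.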

Source journal
Journal of Logical and Algebraic Methods in Programming (COMPUTER SCIENCE, THEORY & METHODS - LOGIC)
CiteScore: 2.60
Self-citation rate: 22.20%
Articles published: 48
Journal description: The Journal of Logical and Algebraic Methods in Programming is an international journal whose aim is to publish high quality, original research papers, survey and review articles, tutorial expositions, and historical studies in the areas of logical and algebraic methods and techniques for guaranteeing correctness and performability of programs and, in general, of computing systems. All aspects will be covered, especially theory and foundations, implementation issues, and applications involving novel ideas.
Latest articles from this journal:
Editorial Board
Generation of algebraic data type values using evolutionary algorithms
Logic and Calculi for All on the occasion of Luís Barbosa's 60th birthday
First order Büchi automata and their application to verification of LTL specifications
Tuning similarity-based fuzzy logic programs