Do LSTMs know about Principle C?

Jeff Mitchell, N. Kazanina, Conor J. Houghton, J. Bowers
{"title":"Do LSTMs know about Principle C?","authors":"Jeff Mitchell, N. Kazanina, Conor J. Houghton, J. Bowers","doi":"10.32470/ccn.2019.1241-0","DOIUrl":null,"url":null,"abstract":"We investigate whether a recurrent network trained on raw text can learn an important syntactic constraint on coreference. A Long Short-Term Memory (LSTM) network that is sensitive to some other syntactic constraints was tested on psycholinguistic materials from two published experiments on coreference. Whereas the participants were sensitive to the Principle C constraint on coreference the LSTM network was not. Our results suggest that, whether as cognitive models of linguistic processes or as engineering solutions in practical applications, recurrent networks may need to be augmented with additional inductive biases to be able to learn models and representations that fully capture the structures of language underlying comprehension.","PeriodicalId":281121,"journal":{"name":"2019 Conference on Cognitive Computational Neuroscience","volume":"39 11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 Conference on Cognitive Computational Neuroscience","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.32470/ccn.2019.1241-0","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

We investigate whether a recurrent network trained on raw text can learn an important syntactic constraint on coreference. A Long Short-Term Memory (LSTM) network that is sensitive to some other syntactic constraints was tested on psycholinguistic materials from two published experiments on coreference. Whereas the participants were sensitive to the Principle C constraint on coreference, the LSTM network was not. Our results suggest that, whether as cognitive models of linguistic processes or as engineering solutions in practical applications, recurrent networks may need to be augmented with additional inductive biases in order to learn models and representations that fully capture the structures of language underlying comprehension.
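The abstract compares human readers' sensitivity to Principle C with an LSTM language model's behaviour on the same materials. A common way to run such a comparison (a sketch of the general technique, not the authors' code) is to score each experimental sentence with the LSTM and read off per-token surprisal at the critical word, then compare conditions in which coreference is or is not blocked by c-command. The sketch below shows those mechanics under that assumption; the model is untrained, the vocabulary is a toy, and the example sentences are invented for illustration.

```python
# Minimal sketch: per-token surprisal from an LSTM language model.
# Untrained weights and a toy vocabulary; in practice the LSTM would be
# trained on a large raw-text corpus before scoring experimental items.
import math
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 64, hid_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> logits over the next token at each position
        hidden, _ = self.lstm(self.embed(token_ids))
        return self.out(hidden)

def token_surprisals(model: nn.Module, token_ids: list[int]) -> list[float]:
    """Surprisal (in bits) of each token given its left context."""
    ids = torch.tensor([token_ids])
    with torch.no_grad():
        logits = model(ids)
    log_probs = torch.log_softmax(logits, dim=-1)
    surprisals = []
    for pos in range(1, len(token_ids)):
        lp = log_probs[0, pos - 1, token_ids[pos]].item()
        surprisals.append(-lp / math.log(2))
    return surprisals

# Illustrative Principle C style minimal pair (hypothetical items, not the
# published materials): in "he said that John ...", "he" c-commands "John",
# so coreference between them is blocked; in "his brother said that John ..."
# it is not.
vocab = {w: i for i, w in enumerate(
    ["<bos>", "he", "his", "brother", "said", "that", "john", "was", "tired"])}
sent_a = ["<bos>", "he", "said", "that", "john", "was", "tired"]              # c-command
sent_b = ["<bos>", "his", "brother", "said", "that", "john", "was", "tired"]  # no c-command

model = LSTMLanguageModel(len(vocab))
for words in (sent_a, sent_b):
    ids = [vocab[w] for w in words]
    s = token_surprisals(model, ids)
    print(words[1:], [round(x, 2) for x in s])
```

A model sensitive to the constraint should show a different surprisal profile at the critical name depending on whether the preceding pronoun c-commands it; the paper reports that, unlike the human participants, the LSTM showed no such sensitivity.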