Large Language Models and the Argument from the Poverty of the Stimulus

Linguistic Inquiry · Language & Linguistics · Impact Factor 1.6 · CAS Zone 1 (Literature) · Publication date: 2024-04-01 · DOI: 10.1162/ling_a_00533
Nur Lan, Emmanuel Chemla, Roni Katzir

Abstract

How much of our linguistic knowledge is innate? According to much of theoretical linguistics, a fair amount. One of the best-known (and most contested) kinds of evidence for a large innate endowment is the so-called argument from the poverty of the stimulus (APS). In a nutshell, an APS obtains when human learners systematically make inductive leaps that are not warranted by the linguistic evidence. A weakness of the APS has been that it is very hard to assess what is warranted by the linguistic evidence. Current artificial neural networks appear to offer a handle on this challenge, and a growing literature over the past few years has started to explore the potential implications of such models for questions of innateness. We focus here on Wilcox et al. (2023), who use several different networks to examine the available evidence as it pertains to wh-movement, including island constraints. They conclude that the (presumably linguistically neutral) networks acquire an adequate knowledge of wh-movement, thus undermining an APS in this domain. We examine the evidence further, looking in particular at parasitic gaps and across-the-board movement, and argue that current networks do not, in fact, succeed in acquiring or even adequately approximating wh-movement from training corpora that roughly correspond in size to the linguistic input that children receive. We also show that the performance of one of the models improves considerably when the training data are artificially enriched with instances of parasitic gaps and across-the-board movement. This finding suggests, albeit tentatively, that the failure of the networks when trained on natural, unenriched corpora is due to the insufficient richness of the linguistic input, thus supporting the APS.
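To make the kind of evaluation at issue concrete, the sketch below shows the standard surprisal-based minimal-pair comparison used in this literature: a language model scores a sentence in which a gap inside an adjunct is licensed by a co-occurring object gap (a parasitic-gap configuration) against a matched sentence in which it is not. This is a minimal illustration only, not the authors' or Wilcox et al.'s evaluation code; the model choice (gpt2), the helper function total_surprisal, and the example sentences are assumptions made for exposition.

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# Model choice is an assumption for illustration; the models in the paper differ.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def total_surprisal(sentence: str) -> float:
    # Sum of per-token surprisals (negative log-probabilities, in nats),
    # scoring each token given the preceding context.
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    token_log_probs = log_probs.gather(1, ids[0, 1:].unsqueeze(1)).squeeze(1)
    return -token_log_probs.sum().item()

# Illustrative minimal pair (sentences assumed, not taken from the paper):
# the gap inside the adjunct is licensed by the object gap in the first
# sentence but not in the second.
licensed = "Which paper did you file without reading?"
unlicensed = "Which paper did you file the report without reading?"

for s in (licensed, unlicensed):
    print(f"{total_surprisal(s):7.2f}  {s}")

# If the model has internalized the relevant generalization, the licensed
# variant should receive lower total surprisal than the unlicensed one.

In practice, such contrasts are aggregated over many matched item sets, and studies of this kind often compare surprisal at the critical region rather than summing over the whole sentence; the whole-sentence sum here is only for simplicity.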
Source journal: Linguistic Inquiry
CiteScore: 2.50
Self-citation rate: 12.50%
Articles published: 54
About the journal: Linguistic Inquiry leads the field in research on current topics in linguistics. This key resource explores new theoretical developments based on the latest international scholarship, capturing the excitement of contemporary debate in full-scale articles as well as shorter contributions (Squibs and Discussion) and more extensive commentary (Remarks and Replies).
Latest articles in this journal:
Inverse Linking and Extraposition
VP-Preposing and Constituency "Paradox"
Using Computational Models to Test Syntactic Learnability
Applicative Recursion and Nominal Licensing
More on (the Lack of) Reconstruction in English Tough-Constructions