Authors: Katherine Howitt, Soumik Dey, W. G. Sakas
Journal: Language Acquisition, 28(1), pp. 65–96. Published 2020-10-07.
DOI: 10.1080/10489223.2020.1803329
Gradual syntactic triggering: The gradient parameter hypothesis
ABSTRACT In this article, we propose a reconceptualization of the principles and parameters (P&P) framework. We argue that in lieu of discrete parameter values, a parameter value exists on a gradient plane that encodes a learner’s confidence that a particular parametric structure licenses the utterances in the learner’s linguistic input. Crucially, this gradient parameter hypothesis obviates the need for default parameter values. Default parameter values can be put to use effectively from the perspective of linguistic learnability but are lacking in terms of empirical and theoretical consistency. We present findings from a computational implementation of a gradient P&P learner. The findings suggest that the gradient parameter hypothesis provides the basis for a viable alternative to existing computational models of language acquisition in the classic P&P paradigm. We close with a brief discussion of how a gradient parameter space offers a path to address shortcomings that have been attributed to the P&P framework.
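The abstract's central idea — that each parameter carries a gradient confidence value updated from input, with no need for defaults — can be illustrated with a minimal toy sketch. This is not the authors' implementation; the class name, the unbiased 0.5 starting weights, and the linear reward-penalty-style update rule are all illustrative assumptions about what such a learner might look like.

```python
import random

class GradientParameterLearner:
    """Toy sketch of a gradient P&P learner: each parameter holds a
    confidence weight in [0, 1] rather than a discrete value."""

    def __init__(self, n_params, rate=0.05):
        # Unbiased starting point: no default parameter values assumed.
        self.weights = [0.5] * n_params
        self.rate = rate

    def choose_grammar(self):
        # Sample a discrete grammar according to current confidences.
        return [random.random() < w for w in self.weights]

    def update(self, grammar, parsed):
        # Nudge each weight toward the sampled setting if that grammar
        # licensed (parsed) the input utterance, away from it otherwise.
        for i, bit in enumerate(grammar):
            target = 1.0 if bit else 0.0
            if parsed:
                self.weights[i] += self.rate * (target - self.weights[i])
            else:
                self.weights[i] += self.rate * ((1.0 - target) - self.weights[i])
```

Under this assumed update rule, confidence in a parametric structure rises each time a grammar using it succeeds on an input utterance, so learning is gradual rather than an all-or-nothing trigger.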
Journal description:
The research published in Language Acquisition: A Journal of Developmental Linguistics makes a clear contribution to linguistic theory by increasing our understanding of how language is acquired. The journal focuses on the acquisition of syntax, semantics, phonology, and morphology, and considers theoretical, experimental, and computational perspectives. Coverage includes solutions to the logical problem of language acquisition, as it arises for particular grammatical proposals; discussion of acquisition data relevant to current linguistic questions; and perspectives derived from theory-driven studies of second language acquisition, language-impaired speakers, and other domains of cognition.