The Interplay of Syntagmatic, Schematic, and Paradigmatic Structure
Vsevolod Kapatsinski
Changing Minds Changing Tools, published 2018-07-06. DOI: 10.7551/mitpress/9780262037860.003.0009
This chapter is a step toward developing an associationist framework for an account of productive morphology. Specifically, the aim is to address the paradigm cell filling problem: how speakers produce novel forms of words they know, a question often studied using elicited production. Learning is assumed to follow the Rescorla-Wagner rule. The model is applied to miniature artificial language learning data from several of the author's experiments. Paradigmatic and syntagmatic associations, together with an operation that copies an activated memory representation into the production plan, are argued to be necessary to account for the full pattern of results. Furthermore, the learning rate must be low enough for the model not to fall prey to accidentally exceptionless generalizations. At such low learning rates, an error-driven model closely resembles a Hebbian model. Limitations of the model are identified, including the use of a strict teacher signal in the Rescorla-Wagner learning rule.
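The Rescorla-Wagner rule assumed here can be sketched compactly. The following is a minimal illustration, not the chapter's implementation: cue names, parameter values, and the dictionary-based weight store are all assumptions made for the example. It also shows why, early in training at a low learning rate, the error-driven update approximates a Hebbian one: the summed prediction stays near zero, so the prediction error is close to the teacher signal itself.

```python
def rescorla_wagner_update(weights, present_cues, outcome_present,
                           learning_rate=0.01, lam=1.0):
    """One Rescorla-Wagner learning trial for a single outcome.

    weights: dict mapping cue -> associative strength
    present_cues: cues active on this trial
    outcome_present: whether the outcome occurred (the strict teacher signal)
    """
    target = lam if outcome_present else 0.0
    # Prediction is the summed strength of all cues present on the trial.
    prediction = sum(weights.get(c, 0.0) for c in present_cues)
    error = target - prediction  # error-driven: learn from surprise only
    for c in present_cues:
        weights[c] = weights.get(c, 0.0) + learning_rate * error
    return weights

# With a low learning rate, prediction remains small over early trials,
# so error ~= target and the update behaves like Hebbian co-occurrence
# counting; only with more training do cues compete for strength.
w = {}
for _ in range(10):
    rescorla_wagner_update(w, {"stem", "suffix"}, outcome_present=True)
```

Because both hypothetical cues ("stem", "suffix") always co-occur with the outcome, they end up with identical, slowly growing strengths, which is the regime the chapter exploits when it argues the learning rate must stay low.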