Syntactic bootstrapping is the hypothesis that learners can use the preliminary syntactic structure of a sentence to identify and characterise the meanings of novel verbs. Previous work has shown that syntactic bootstrapping can begin using only a few seed nouns (Connor et al., 2010; Connor et al., 2012). Here, we relax their key assumption: rather than training the model over the entire corpus at once (batch mode), we train the model incrementally, thus more realistically simulating a human learner. We also improve on the verb prediction method by incorporating the assumption that verb assignments are stable over time. We show that, given a high enough number of seed nouns (around 30), an incremental model achieves similar performance to the batch model. We also find that the number of seed nouns shown to be sufficient in the previous work is not sufficient under the more realistic incremental model. The results demonstrate that adopting more realistic assumptions about the early stages of language acquisition can provide new insights without undermining performance.
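The batch-versus-incremental contrast at the core of the abstract can be sketched with a toy example. Everything below is a hypothetical illustration, not the authors' model: the seed nouns, the corpus, and the idea of assigning each verb an argument count are placeholders standing in for the real semantic-role-labelling setup. The incremental learner also illustrates the stability assumption, keeping a verb's current assignment unless new evidence strictly outweighs it.

```python
# Illustrative sketch only -- names, data, and the "argument count"
# proxy are assumptions, not the paper's actual implementation.
from collections import Counter

SEED_NOUNS = {"ball", "dog", "mommy"}  # hypothetical seed nouns

def argument_count(sentence, seeds):
    """Count the known (seed) nouns in a sentence -- a crude
    proxy for the number of arguments the verb takes."""
    return sum(1 for word in sentence if word in seeds)

def batch_train(corpus, seeds):
    """Batch mode: observe the whole corpus at once, then give each
    verb the argument count it co-occurred with most often."""
    counts = {}
    for verb, sentence in corpus:
        counts.setdefault(verb, Counter())[argument_count(sentence, seeds)] += 1
    return {verb: c.most_common(1)[0][0] for verb, c in counts.items()}

def incremental_train(corpus, seeds):
    """Incremental mode: process one sentence at a time. The stability
    assumption: keep the current assignment unless the evidence for a
    rival count strictly exceeds the evidence for the current one."""
    counts = {}
    assignment = {}
    for verb, sentence in corpus:
        counts.setdefault(verb, Counter())[argument_count(sentence, seeds)] += 1
        best, _ = counts[verb].most_common(1)[0]
        if verb not in assignment or counts[verb][best] > counts[verb][assignment[verb]]:
            assignment[verb] = best
    return assignment
```

With few seed nouns, `argument_count` undercounts arguments in many sentences, so the incremental learner's early (and sticky) assignments are built on noisy evidence; with enough seeds the running counts converge and the two modes agree, mirroring the abstract's finding that roughly 30 seed nouns close the gap.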
Christos Christodoulopoulos, D. Roth, C. Fisher. "An incremental model of syntactic bootstrapping." Proceedings of the 7th Workshop on Cognitive Aspects of Computational Language Learning, August 2016. doi:10.18653/v1/W16-1906