Humans Learn Language from Situated Communicative Interactions. What about Machines?

IF 9.3 · Q2 (Computer Science) · Computational Linguistics · Pub Date: 2024-08-02 · DOI: 10.1162/coli_a_00534
Katrien Beuls, Paul Van Eecke
{"title":"Humans Learn Language from Situated Communicative Interactions. What about Machines?","authors":"Katrien Beuls, Paul Van Eecke","doi":"10.1162/coli_a_00534","DOIUrl":null,"url":null,"abstract":"Humans acquire their native languages by taking part in communicative interactions with their caregivers. These interactions are meaningful, intentional, and situated in their everyday environment. The situated and communicative nature of the interactions is essential to the language acquisition process, as language learners depend on clues provided by the communicative environment to make sense of the utterances they perceive. As such, the linguistic knowledge they build up is rooted in linguistic forms, their meaning, and their communicative function. When it comes to machines, the situated, communicative, and interactional aspects of language learning are often passed over. This applies in particular to today’s large language models (LLMs), where the input is predominantly text-based, and where the distribution of character groups or words serves as a basis for modeling the meaning of linguistic expressions. In this article, we argue that this design choice lies at the root of a number of important limitations, in particular regarding the data hungriness of the models, their limited ability to perform human-like logical and pragmatic reasoning, and their susceptibility to biases. At the same time, we make a case for an alternative approach that models how artificial agents can acquire linguistic structures by participating in situated communicative interactions. Through a selection of experiments, we show how the linguistic knowledge that is captured in the resulting models is of a fundamentally different nature than the knowledge captured by LLMs and argue that this change of perspective provides a promising path towards more human-like language processing in machines.","PeriodicalId":49089,"journal":{"name":"Computational Linguistics","volume":null,"pages":null},"PeriodicalIF":9.3000,"publicationDate":"2024-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Linguistics","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1162/coli_a_00534","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Humans acquire their native languages by taking part in communicative interactions with their caregivers. These interactions are meaningful, intentional, and situated in their everyday environment. The situated and communicative nature of the interactions is essential to the language acquisition process, as language learners depend on clues provided by the communicative environment to make sense of the utterances they perceive. As such, the linguistic knowledge they build up is rooted in linguistic forms, their meaning, and their communicative function. When it comes to machines, the situated, communicative, and interactional aspects of language learning are often passed over. This applies in particular to today’s large language models (LLMs), where the input is predominantly text-based, and where the distribution of character groups or words serves as a basis for modeling the meaning of linguistic expressions. In this article, we argue that this design choice lies at the root of a number of important limitations, in particular regarding the data hungriness of the models, their limited ability to perform human-like logical and pragmatic reasoning, and their susceptibility to biases. At the same time, we make a case for an alternative approach that models how artificial agents can acquire linguistic structures by participating in situated communicative interactions. Through a selection of experiments, we show how the linguistic knowledge that is captured in the resulting models is of a fundamentally different nature than the knowledge captured by LLMs and argue that this change of perspective provides a promising path towards more human-like language processing in machines.
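The alternative approach described in the abstract, in which agents acquire form-meaning mappings by taking part in situated, goal-directed interactions, can be made concrete with a minimal naming-game simulation in the tradition of Steels' language-game experiments. The sketch below is illustrative only and is not the experimental setup of the article; the population size, the lateral-inhibition update rule, and all parameters are assumptions chosen for readability.

```python
# Minimal naming-game sketch (illustrative; not the article's experiments).
# Agents build form-meaning mappings purely from situated interactions:
# a shared context, a topic, an uttered word, and communicative feedback.
import random
import string


def new_word(length=4):
    """Invent a random word form."""
    return "".join(random.choice(string.ascii_lowercase) for _ in range(length))


class Agent:
    def __init__(self):
        # lexicon maps (word, meaning) pairs to a score in (0, 1]
        self.lexicon = {}

    def produce(self, topic):
        """Pick the highest-scoring word for the topic, inventing one if needed."""
        candidates = {w: s for (w, m), s in self.lexicon.items() if m == topic}
        if not candidates:
            word = new_word()
            self.lexicon[(word, topic)] = 0.5
            return word
        return max(candidates, key=candidates.get)

    def interpret(self, word, context):
        """Return the meaning in the current context that this agent links to the word."""
        candidates = {m: s for (w, m), s in self.lexicon.items()
                      if w == word and m in context}
        return max(candidates, key=candidates.get) if candidates else None

    def align(self, word, topic, success):
        """Reward the used pair on success and inhibit its competitors;
        punish the used pair on failure."""
        key = (word, topic)
        self.lexicon.setdefault(key, 0.5)
        for (w, m) in list(self.lexicon):
            if (w, m) == key:
                delta = 0.1 if success else -0.1
            elif success and (w == word or m == topic):
                delta = -0.1  # lateral inhibition of competing pairs
            else:
                continue
            score = self.lexicon[(w, m)] + delta
            if score <= 0:
                del self.lexicon[(w, m)]          # forget useless pairs
            else:
                self.lexicon[(w, m)] = min(score, 1.0)


def play(agents, objects, n_interactions=5000):
    """Run pairwise interactions and return the overall communicative success rate."""
    successes = 0
    for _ in range(n_interactions):
        speaker, hearer = random.sample(agents, 2)
        context = random.sample(objects, 3)   # the situated scene both agents perceive
        topic = random.choice(context)        # the speaker's communicative goal
        word = speaker.produce(topic)
        success = hearer.interpret(word, context) == topic
        speaker.align(word, topic, success)
        if success:
            hearer.align(word, topic, True)
        else:
            # repair: the speaker points at the intended topic, so the hearer
            # can adopt the form-meaning pair for later interactions
            hearer.lexicon.setdefault((word, topic), 0.5)
        successes += success
    return successes / n_interactions


if __name__ == "__main__":
    random.seed(0)
    population = [Agent() for _ in range(10)]
    scene_objects = [f"obj-{k}" for k in range(10)]
    print(f"communicative success: {play(population, scene_objects):.2f}")
```

In such a simulation the agents' lexicons converge towards shared word-object conventions without any agent ever seeing text: the resulting knowledge is anchored in forms, their meanings, and communicative outcomes, which is the contrast with distributional training that the article develops.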
Source Journal
Computational Linguistics
Category: Computer Science - Artificial Intelligence
Self-citation rate: 0.00%
Articles published: 45
Journal description: Computational Linguistics is the longest-running publication devoted exclusively to the computational and mathematical properties of language and the design and analysis of natural language processing systems. This highly regarded quarterly offers university and industry linguists, computational linguists, artificial intelligence and machine learning investigators, cognitive scientists, speech specialists, and philosophers the latest information about the computational aspects of all the facets of research on language.