{"title":"意义的几何与动力》。","authors":"Peter Gärdenfors","doi":"10.1111/tops.12767","DOIUrl":null,"url":null,"abstract":"<p><p>An enigma for human languages is that children learn to understand words in their mother tongue extremely fast. The cognitive sciences have not been able to fully understand the mechanisms behind this highly efficient learning process. In order to provide at least a partial answer to this problem, I have developed a cognitive model of the semantics of natural language in terms of conceptual spaces. I present a background to conceptual spaces and provide a brief summary of their main features, in particular how it handles learning of concepts. I then apply the model to give a geometric account of the semantics of different word classes. In particular, I propose a \"single-domain hypotheses\" for the semantics of all word classes except nouns. These hypotheses provide a partial answer to the enigma of how words are learned. Next, a dynamic cognitive model of events is introduced that replaces and extends the function of thematic roles. I apply it to analyze the meanings of different kinds of verbs. I argue that the model also explains some aspects of syntactic structure. In particular, I propose that a sentence typically refers to an event. Some further applications of conceptual spaces are briefly presented.</p>","PeriodicalId":47822,"journal":{"name":"Topics in Cognitive Science","volume":" ","pages":""},"PeriodicalIF":2.9000,"publicationDate":"2024-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The Geometry and Dynamics of Meaning.\",\"authors\":\"Peter Gärdenfors\",\"doi\":\"10.1111/tops.12767\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>An enigma for human languages is that children learn to understand words in their mother tongue extremely fast. The cognitive sciences have not been able to fully understand the mechanisms behind this highly efficient learning process. In order to provide at least a partial answer to this problem, I have developed a cognitive model of the semantics of natural language in terms of conceptual spaces. I present a background to conceptual spaces and provide a brief summary of their main features, in particular how it handles learning of concepts. I then apply the model to give a geometric account of the semantics of different word classes. In particular, I propose a \\\"single-domain hypotheses\\\" for the semantics of all word classes except nouns. These hypotheses provide a partial answer to the enigma of how words are learned. Next, a dynamic cognitive model of events is introduced that replaces and extends the function of thematic roles. I apply it to analyze the meanings of different kinds of verbs. I argue that the model also explains some aspects of syntactic structure. In particular, I propose that a sentence typically refers to an event. 
Some further applications of conceptual spaces are briefly presented.</p>\",\"PeriodicalId\":47822,\"journal\":{\"name\":\"Topics in Cognitive Science\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":2.9000,\"publicationDate\":\"2024-11-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Topics in Cognitive Science\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1111/tops.12767\",\"RegionNum\":2,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, EXPERIMENTAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Topics in Cognitive Science","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1111/tops.12767","RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
An enigma of human language is that children learn to understand words in their mother tongue extremely quickly. The cognitive sciences have not yet fully explained the mechanisms behind this highly efficient learning process. To provide at least a partial answer to this problem, I have developed a cognitive model of the semantics of natural language in terms of conceptual spaces. I present the background to conceptual spaces and briefly summarize their main features, in particular how the framework handles the learning of concepts. I then apply the model to give a geometric account of the semantics of different word classes. In particular, I propose "single-domain hypotheses" for the semantics of all word classes except nouns. These hypotheses provide a partial answer to the enigma of how words are learned. Next, a dynamic cognitive model of events is introduced that replaces and extends the function of thematic roles. I apply it to analyze the meanings of different kinds of verbs. I argue that the model also explains some aspects of syntactic structure; in particular, I propose that a sentence typically refers to an event. Some further applications of conceptual spaces are briefly presented.
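To make the geometric idea concrete, here is a minimal sketch (an editorial illustration, not code from the paper) of the standard concept-learning mechanism in conceptual-spaces theory: a concept is a convex region of a metric space, and such regions can be generated as the Voronoi cells around learned prototype points. The two-dimensional domain, the colour terms, and the coordinates below are all illustrative assumptions.

import math

# Hypothetical prototypes in a two-dimensional domain (e.g., two
# colour-appearance dimensions). All names and values are illustrative.
PROTOTYPES = {
    "red":    (0.9, 0.2),
    "green":  (0.2, 0.8),
    "yellow": (0.8, 0.8),
}

def categorize(point):
    """Assign a point to the concept of its nearest prototype.

    The decision regions this induces form a Voronoi tessellation of
    the domain, and each cell is convex -- the geometric hallmark of a
    'natural' concept in conceptual-spaces theory.
    """
    return min(PROTOTYPES, key=lambda name: math.dist(point, PROTOTYPES[name]))

print(categorize((0.7, 0.3)))  # -> 'red'
print(categorize((0.4, 0.9)))  # -> 'green'

Because every Voronoi cell around a prototype is convex, a learner who stores only one prototype per word already has a categorization of the entire domain, which is one way the model addresses fast word learning.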
Journal introduction:
Topics in Cognitive Science (topiCS) is an innovative new journal that covers all areas of cognitive science, including cognitive modeling, cognitive neuroscience, cognitive anthropology, and cognitive science and philosophy. topiCS aims to provide a forum for:
- New communities of researchers
- New controversies in established areas
- Debates and commentaries
- Reflections and integration
The publication features multiple scholarly papers dedicated to a single topic. Some of these topics will appear together in one issue, but others may appear across several issues or develop into a regular feature. Controversies or debates started in one issue may be followed up by commentaries in a later issue, and so on. However, the format and origin of the topics will vary greatly.