Modeling visuospatial reasoning

Spatial Cognition and Computation · Impact Factor 1.6 · CAS Zone 4 (Psychology) · JCR Q3 (Psychology, Experimental) · Publication date: 2019-01-02 · DOI: 10.1080/13875868.2018.1460751
Stephen K. Reed
{"title":"Modeling visuospatial reasoning","authors":"Stephen K. Reed","doi":"10.1080/13875868.2018.1460751","DOIUrl":null,"url":null,"abstract":"ABSTRACT I apply my proposed modification of Soar/Spatial/Visual System and Kosslyn’s (1983) computational operations on images to problems within a 2 × 2 taxonomy that classifies research according to whether the coding involves static or dynamic relations within an object or between objects (Newcombe & Shipley, 2015). I then repeat this analysis for problems that are included in mathematics and science curricula. Because many of these problems involve reasoning from diagrams Hegarty’s (2011) framework for reasoning from visual-spatial displays provides additional support for organizing this topic. Two more relevant frameworks specify reasoning at different levels of abstraction (Reed, 2016) and with different combinations of actions and objects (Reed, 2018). The article concludes with suggestions for future directions.","PeriodicalId":46199,"journal":{"name":"Spatial Cognition and Computation","volume":"8 1","pages":"1 - 45"},"PeriodicalIF":1.6000,"publicationDate":"2019-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Spatial Cognition and Computation","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1080/13875868.2018.1460751","RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Cited by: 6

Abstract

I apply my proposed modification of the Soar/Spatial/Visual System and Kosslyn’s (1983) computational operations on images to problems within a 2 × 2 taxonomy that classifies research according to whether the coding involves static or dynamic relations within an object or between objects (Newcombe & Shipley, 2015). I then repeat this analysis for problems that are included in mathematics and science curricula. Because many of these problems involve reasoning from diagrams, Hegarty’s (2011) framework for reasoning from visual-spatial displays provides additional support for organizing this topic. Two further relevant frameworks specify reasoning at different levels of abstraction (Reed, 2016) and with different combinations of actions and objects (Reed, 2018). The article concludes with suggestions for future directions.
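As a reading aid only, the 2 × 2 taxonomy mentioned in the abstract can be sketched as a small lookup structure: one axis codes whether the spatial relation is static or dynamic, the other whether it holds within an object or between objects. The sketch below is in Python, and the example task in each cell is an illustrative assumption, not taken from the article.

    # Minimal sketch (illustrative, not drawn from the article) of the 2 x 2 taxonomy
    # attributed to Newcombe & Shipley (2015): relations are static or dynamic, and
    # hold either within an object or between objects. Example tasks are assumptions.
    from enum import Enum

    class Relation(Enum):
        STATIC = "static"
        DYNAMIC = "dynamic"

    class Scope(Enum):
        WITHIN_OBJECT = "within an object"
        BETWEEN_OBJECTS = "between objects"

    # Hypothetical example task for each of the four cells.
    TAXONOMY = {
        (Relation.STATIC, Scope.WITHIN_OBJECT): "encoding an object's shape",
        (Relation.DYNAMIC, Scope.WITHIN_OBJECT): "mentally rotating an object",
        (Relation.STATIC, Scope.BETWEEN_OBJECTS): "reading a spatial layout from a map",
        (Relation.DYNAMIC, Scope.BETWEEN_OBJECTS): "imagining a scene from another viewpoint",
    }

    for (relation, scope), task in TAXONOMY.items():
        print(f"{relation.value} relations {scope.value}: e.g., {task}")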
Source journal: Spatial Cognition and Computation (Psychology, Experimental)
CiteScore: 4.40
Self-citation rate: 5.30%
Articles published per year: 10
Latest articles in this journal:
- In memoriam: Christian Freksa (1950-2020)
- Treat robots as humans? Perspective choice in human-human and human-robot spatial language interaction
- Direction information is more influential than distance information in memory for location relative to landmarks
- Task-dependent sketch maps
- Evidence for flexible navigation strategies during spatial learning involving path choices