{"title":"From Cognition to Computation: A Comparative Review of Human Attention and Transformer Architectures","authors":"Minglu Zhao, Dehong Xu, Tao Gao","doi":"arxiv-2407.01548","DOIUrl":null,"url":null,"abstract":"Attention is a cornerstone of human cognition that facilitates the efficient\nextraction of information in everyday life. Recent developments in artificial\nintelligence like the Transformer architecture also incorporate the idea of\nattention in model designs. However, despite the shared fundamental principle\nof selectively attending to information, human attention and the Transformer\nmodel display notable differences, particularly in their capacity constraints,\nattention pathways, and intentional mechanisms. Our review aims to provide a\ncomparative analysis of these mechanisms from a cognitive-functional\nperspective, thereby shedding light on several open research questions. The\nexploration encourages interdisciplinary efforts to derive insights from human\nattention mechanisms in the pursuit of developing more generalized artificial\nintelligence.","PeriodicalId":501219,"journal":{"name":"arXiv - QuanBio - Other Quantitative Biology","volume":"211 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-04-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuanBio - Other Quantitative Biology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2407.01548","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Attention is a cornerstone of human cognition that facilitates the efficient extraction of information in everyday life. Recent developments in artificial intelligence, such as the Transformer architecture, also incorporate the idea of attention into model design. However, despite the shared fundamental principle of selectively attending to information, human attention and the Transformer model display notable differences, particularly in their capacity constraints, attention pathways, and intentional mechanisms. Our review provides a comparative analysis of these mechanisms from a cognitive-functional perspective, thereby shedding light on several open research questions. This exploration encourages interdisciplinary efforts to derive insights from human attention mechanisms in the pursuit of more generalized artificial intelligence.
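
For readers less familiar with the Transformer side of the comparison, the sketch below shows the scaled dot-product attention of Vaswani et al. (2017), the mechanism the review refers to. This is a minimal illustrative implementation, not code from the paper; the function and variable names are our own.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard Transformer attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K: (seq_len, d_k) arrays; V: (seq_len, d_v). Names are illustrative.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # pairwise query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax over keys
    return weights @ V                                    # attention-weighted mix of values

# Toy usage with random data
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)        # (4, 8)
```

Unlike human attention, which the review describes as capacity-limited, this computation attends to every key in parallel with no hard limit on how many items receive weight.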