Latest publications: 2022 Conference on Cognitive Computational Neuroscience
Deriving Loss Functions for Regression and Classification from Humans
DOI: 10.32470/ccn.2022.1237-0
Hansol X. Ryu, M. Srinivasan
Citations: 0
VOneCAE: Interpreting through the eyes of V1
DOI: 10.32470/ccn.2022.1173-0
Subhrasankar Chatterjee, Debasis Samanta
Citations: 0
How many non-linear computations are required for CNNs to account for the response properties of V1?
DOI: 10.32470/ccn.2022.1217-0
Hui-Yuan Miao, Hojin Jang, F. Tong
Citations: 0
The neurobiology of strategic competition
DOI: 10.32470/ccn.2022.1270-0
Yaoguang Jiang, M. Platt
Citations: 0
A Multivariate Point Process Model for Neural Spike Trains
DOI: 10.32470/ccn.2022.1153-0
R. Ramezan, Mei-Ching Chen, Martin Lysy, P. Marriott
Citations: 0
Models of confidence to facilitate engaging task designs
DOI: 10.32470/ccn.2022.1150-0
Vanessa Ceja, Yussuf Ezzeldine, Megan A. K. Peters
Citations: 0
Dissociation Between The Use of Implicit and Explicit Priors in Bayesian Perceptual Inference
DOI: 10.32470/ccn.2022.1064-0
Caroline Bévalot, Florent Meyniel
Citations: 0
Efficiency of object recognition networks on an absolute scale
DOI: 10.32470/ccn.2022.1156-0
R. Murray, Devin Kehoe
Abstract: Deep neural networks have made rapid advances in object recognition, but progress has mostly been made through experimentation, with little guidance from normative theories. Here we use ideal observer theory and associated methods to compare current network performance to theoretical limits on performance. We measure network performance and ideal observer performance on a modified ImageNet task, where model observers view samples from a limited number of object categories, in several levels of external white Gaussian noise. We find that although current networks achieve 90% performance or better on the standard ImageNet task, the ideal observer performs vastly better on the more limited task we consider here. The networks' "calculation efficiency", a measure of the extent to which they use all available information to perform a task, is on the order of 10^-5, an exceedingly small value. We consider reasons why efficiency may be so low, and outline further uses of ideal observers and noise methods to understand network performance.
Citations: 0
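The "calculation efficiency" reported in the abstract above is a standard ideal-observer measure. A minimal sketch, assuming the common definition as the squared ratio of model sensitivity (d') to ideal-observer sensitivity; the numeric values below are hypothetical, chosen only to illustrate the order of magnitude the abstract reports:

```python
def calculation_efficiency(d_prime_model: float, d_prime_ideal: float) -> float:
    """Calculation efficiency: the squared ratio of a model observer's
    sensitivity (d') to the ideal observer's sensitivity on the same task.
    Interpretable as the fraction of the available stimulus information
    the model actually uses; lies in (0, 1] when the model is no better
    than ideal."""
    return (d_prime_model / d_prime_ideal) ** 2

# Hypothetical illustration: a network whose sensitivity is roughly
# 1/300 of the ideal observer's has an efficiency on the order of 1e-5,
# the magnitude the abstract reports.
eff = calculation_efficiency(d_prime_model=0.01, d_prime_ideal=3.16)
```

Because efficiency is a ratio against the theoretical optimum on the identical stimuli, it gives an absolute scale for comparing networks across tasks, unlike raw accuracy.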
Dynamical Models of Decision Confidence in Visual Perception: Implementation and Comparison
DOI: 10.32470/ccn.2022.1079-0
Sebastian Hellmann, Michael Zehetleitner, Manuel Rausch
Citations: 0
Unsupervised learning of translucent material appearance using StyleGAN
DOI: 10.32470/ccn.2022.1114-0
C. Liao, Masataka Sawayama, Bei Xiao
Citations: 0