{"title":"Sparse adaptive memory and handwritten digit recognition","authors":"B. Flachs, M. Flynn","doi":"10.1109/ICNN.1994.374336","DOIUrl":null,"url":null,"abstract":"Pattern recognition is a budding field with many possible approaches. This article describes sparse adaptive memory (SARI), an associative memory built upon the strengths of Parzen classifiers, nearest neighbor classifiers, feedforward neural networks, and is related to learning vector quantization. A key feature of this learning architecture is the ability to adaptively change its prototype patterns in addition to its output mapping. As SAM changes the prototype patterns in the list, it isolates modes in the density functions to produce a classifier that is in some senses optimal. Some very important interactions of gradient descent learning are exposed, providing conditions under which gradient descent will converge to an admissible solution in an associative memory structure. A layer of learning heuristics can be built upon the basic gradient descent learning algorithm to improve memory efficiency in terms of error rate, and therefore hardware requirements. 
A simulation study examines the effects of one such heuristic in the context of handwritten digit recognition.<<ETX>>","PeriodicalId":209128,"journal":{"name":"Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICNN.1994.374336","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0
Abstract
Pattern recognition is a budding field with many possible approaches. This article describes sparse adaptive memory (SAM), an associative memory built upon the strengths of Parzen classifiers, nearest-neighbor classifiers, and feedforward neural networks, and related to learning vector quantization. A key feature of this learning architecture is its ability to adaptively change its prototype patterns in addition to its output mapping. As SAM changes the prototype patterns in the list, it isolates modes in the density functions to produce a classifier that is in some senses optimal. Some very important interactions of gradient descent learning are exposed, providing conditions under which gradient descent will converge to an admissible solution in an associative memory structure. A layer of learning heuristics can be built upon the basic gradient descent learning algorithm to improve memory efficiency in terms of error rate, and therefore hardware requirements. A simulation study examines the effects of one such heuristic in the context of handwritten digit recognition.
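The abstract notes that SAM adapts its prototype patterns by gradient descent and is related to learning vector quantization. The paper's exact update rule is not given here, so the following is a minimal illustrative sketch of the LVQ1-style prototype adaptation that SAM generalizes: the nearest prototype is pulled toward an input of the same class and pushed away from an input of a different class. Function and variable names are hypothetical, and SAM additionally adapts an output mapping, which this sketch omits.

```python
import numpy as np

def lvq1_step(prototypes, labels, x, y, lr=0.1):
    """One LVQ1-style step (illustrative, not the paper's SAM rule):
    find the prototype nearest to input x, then move it toward x if
    its class label matches y, or away from x otherwise."""
    dists = np.linalg.norm(prototypes - x, axis=1)  # distance to each prototype
    k = int(np.argmin(dists))                       # index of nearest prototype
    sign = 1.0 if labels[k] == y else -1.0          # attract on match, repel on mismatch
    prototypes[k] += sign * lr * (x - prototypes[k])
    return k

# Toy run: two prototypes, one per class, and a sample of class 0.
protos = np.array([[0.0, 0.0], [1.0, 1.0]])
proto_labels = [0, 1]
winner = lvq1_step(protos, proto_labels, np.array([0.2, 0.1]), y=0)
```

After this step the class-0 prototype has moved a fraction `lr` of the way toward the input; repeating such updates over a training set lets the prototypes settle on modes of the class-conditional densities, which is the behavior the abstract attributes to SAM's prototype adaptation.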