The C-CNN model: Do we really need multiplicative synapses in convolutional neural networks?
R. Dogaru, Adrian-Dumitru Mirica, I. Dogaru
2022 14th International Conference on Communications (COMM), published 2022-06-16. DOI: 10.1109/comm54429.2022.9817267
Abstract
Comparative synapses are proposed and investigated in the context of convolutional neural networks as replacements for the traditional, multiplier-based synapses. A comparative synapse is an operator inspired by the min() operator used in fuzzy logic as a replacement for the product when implementing the AND function. Its implementation complexity is linear in the number of bits, unlike multipliers, which require quadratic complexity. In effect, at a typical resolution of 8 bits, the comparative synapse would reduce the hardware resources allocated to the operator by a factor of eight. A C-CNN model was constructed to support comparative synapses together with their update and error-propagation rules. GPU acceleration of the C-CNN model was achieved using CuPy. The model was trained on several widely known image recognition datasets, including MNIST, CIFAR and USPS. It turns out that functional performance (accuracy) is not dramatically affected in the C-CNN compared with a similar traditional CNN model using multiplicative operators, thus opening an interesting implementation perspective, particularly for TinyML and hardware-oriented solutions, with significant reductions in energy, silicon area and cost. The approach is scalable to more sophisticated CNN models, provided adequate optimized operators adapted to this new synaptic model are available.
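The core idea of the abstract can be sketched in a few lines: a standard convolutional synapse accumulates products of weights and inputs, while the comparative synapse replaces each product with a min()-style comparison, as in the fuzzy-logic AND. The sketch below is an illustration only, assuming min() is the comparative operator; the paper's exact operator and its update rules may differ.

```python
import numpy as np

def multiplicative_synapse(w, x):
    # Standard CNN accumulation: sum of weight * input over a patch.
    return np.sum(w * x)

def comparative_synapse(w, x):
    # Comparative accumulation (illustrative): each product is replaced by
    # an element-wise min(), inspired by the fuzzy-logic AND. A min() over
    # n-bit operands needs hardware linear in n, while a multiplier is
    # roughly quadratic in n -- hence the claimed ~8x saving at 8 bits.
    return np.sum(np.minimum(w, x))

w = np.array([0.5, 0.2, 0.8])
x = np.array([0.3, 0.9, 0.4])
print(multiplicative_synapse(w, x))  # ~0.65 (0.15 + 0.18 + 0.32)
print(comparative_synapse(w, x))     # ~0.90 (0.3 + 0.2 + 0.4)
```

Since np.minimum is an ordinary element-wise NumPy ufunc, the same sketch maps directly onto CuPy arrays for the GPU acceleration the abstract mentions.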