Anas Skalli, Mirko Goldmann, Nasibeh Haghighi, Stephan Reitzenstein, James A. Lott, Daniel Brunner
{"title":"三元权重光神经网络的退火启发式训练","authors":"Anas Skalli, Mirko Goldmann, Nasibeh Haghighi, Stephan Reitzenstein, James A. Lott, Daniel Brunner","doi":"arxiv-2409.01042","DOIUrl":null,"url":null,"abstract":"Artificial neural networks (ANNs) represent a fundamentally connectionnist\nand distributed approach to computing, and as such they differ from classical\ncomputers that utilize the von Neumann architecture. This has revived research\ninterest in new unconventional hardware to enable more efficient\nimplementations of ANNs rather than emulating them on traditional machines. In\norder to fully leverage the capabilities of this new generation of ANNs,\noptimization algorithms that take into account hardware limitations and\nimperfections are necessary. Photonics represents a particularly promising\nplatform, offering scalability, high speed, energy efficiency, and the\ncapability for parallel information processing. Yet, fully fledged\nimplementations of autonomous optical neural networks (ONNs) with in-situ\nlearning remain scarce. In this work, we propose a ternary weight architecture\nhigh-dimensional semiconductor laser-based ONN. We introduce a simple method\nfor achieving ternary weights with Boolean hardware, significantly increasing\nthe ONN's information processing capabilities. Furthermore, we design a novel\nin-situ optimization algorithm that is compatible with, both, Boolean and\nternary weights, and provide a detailed hyperparameter study of said algorithm\nfor two different tasks. Our novel algorithm results in benefits, both in terms\nof convergence speed and performance. Finally, we experimentally characterize\nthe long-term inference stability of our ONN and find that it is extremely\nstable with a consistency above 99\\% over a period of more than 10 hours,\naddressing one of the main concerns in the field. 
Our work is of particular\nrelevance in the context of in-situ learning under restricted hardware\nresources, especially since minimizing the power consumption of auxiliary\nhardware is crucial to preserving efficiency gains achieved by non-von Neumann\nANN implementations.","PeriodicalId":501168,"journal":{"name":"arXiv - CS - Emerging Technologies","volume":"95 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Annealing-inspired training of an optical neural network with ternary weights\",\"authors\":\"Anas Skalli, Mirko Goldmann, Nasibeh Haghighi, Stephan Reitzenstein, James A. Lott, Daniel Brunner\",\"doi\":\"arxiv-2409.01042\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Artificial neural networks (ANNs) represent a fundamentally connectionnist\\nand distributed approach to computing, and as such they differ from classical\\ncomputers that utilize the von Neumann architecture. This has revived research\\ninterest in new unconventional hardware to enable more efficient\\nimplementations of ANNs rather than emulating them on traditional machines. In\\norder to fully leverage the capabilities of this new generation of ANNs,\\noptimization algorithms that take into account hardware limitations and\\nimperfections are necessary. Photonics represents a particularly promising\\nplatform, offering scalability, high speed, energy efficiency, and the\\ncapability for parallel information processing. Yet, fully fledged\\nimplementations of autonomous optical neural networks (ONNs) with in-situ\\nlearning remain scarce. In this work, we propose a ternary weight architecture\\nhigh-dimensional semiconductor laser-based ONN. We introduce a simple method\\nfor achieving ternary weights with Boolean hardware, significantly increasing\\nthe ONN's information processing capabilities. 
Furthermore, we design a novel\\nin-situ optimization algorithm that is compatible with, both, Boolean and\\nternary weights, and provide a detailed hyperparameter study of said algorithm\\nfor two different tasks. Our novel algorithm results in benefits, both in terms\\nof convergence speed and performance. Finally, we experimentally characterize\\nthe long-term inference stability of our ONN and find that it is extremely\\nstable with a consistency above 99\\\\% over a period of more than 10 hours,\\naddressing one of the main concerns in the field. Our work is of particular\\nrelevance in the context of in-situ learning under restricted hardware\\nresources, especially since minimizing the power consumption of auxiliary\\nhardware is crucial to preserving efficiency gains achieved by non-von Neumann\\nANN implementations.\",\"PeriodicalId\":501168,\"journal\":{\"name\":\"arXiv - CS - Emerging Technologies\",\"volume\":\"95 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Emerging Technologies\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.01042\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Emerging Technologies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.01042","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Annealing-inspired training of an optical neural network with ternary weights
Artificial neural networks (ANNs) represent a fundamentally connectionist
and distributed approach to computing, and as such they differ from classical
computers that utilize the von Neumann architecture. This has revived research
interest in new unconventional hardware to enable more efficient
implementations of ANNs rather than emulating them on traditional machines. In
order to fully leverage the capabilities of this new generation of ANNs,
optimization algorithms that take into account hardware limitations and
imperfections are necessary. Photonics represents a particularly promising
platform, offering scalability, high speed, energy efficiency, and the
capability for parallel information processing. Yet, fully fledged
implementations of autonomous optical neural networks (ONNs) with in-situ
learning remain scarce. In this work, we propose a high-dimensional,
semiconductor-laser-based ONN with a ternary weight architecture. We introduce a simple method
for achieving ternary weights with Boolean hardware, significantly increasing
the ONN's information processing capabilities. Furthermore, we design a novel
in-situ optimization algorithm that is compatible with both Boolean and
ternary weights, and provide a detailed hyperparameter study of this algorithm
on two different tasks. The algorithm improves both convergence speed and
final performance. Finally, we experimentally characterize
the long-term inference stability of our ONN and find that it is extremely
stable with a consistency above 99% over a period of more than 10 hours,
addressing one of the main concerns in the field. Our work is of particular
relevance in the context of in-situ learning under restricted hardware
resources, especially since minimizing the power consumption of auxiliary
hardware is crucial to preserving efficiency gains achieved by non-von Neumann
ANN implementations.
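The two core ideas named in the abstract, obtaining ternary weights from Boolean hardware and training them with an annealing-inspired greedy search, can be illustrated in software. The sketch below is an assumption-laden toy, not the authors' hardware scheme or their actual algorithm: it assumes ternary weights are formed as the difference of two Boolean channels, and it stands in for the optical forward pass with a plain linear readout.

```python
import numpy as np

rng = np.random.default_rng(0)

def ternary_from_boolean(b_plus, b_minus):
    """Illustrative only: one plausible way to get {-1, 0, +1} weights
    from Boolean (0/1) hardware is to subtract two Boolean channels.
    This is an assumption for illustration, not the paper's scheme."""
    return b_plus.astype(int) - b_minus.astype(int)

def anneal_ternary(x, y, steps=300, k0=8):
    """Annealing-inspired greedy search over ternary weights.

    Per step, re-randomize k weights, where k shrinks ("anneals") from
    k0 toward 1, and keep the mutation only if the task error drops.
    The linear readout x @ w stands in for the hardware forward pass.
    """
    n = x.shape[1]
    w = rng.integers(-1, 2, size=n)          # ternary init in {-1, 0, +1}
    err = np.mean((x @ w - y) ** 2)
    for t in range(steps):
        k = max(1, int(round(k0 * (1 - t / steps))))  # annealed flip count
        idx = rng.choice(n, size=k, replace=False)
        w_new = w.copy()
        w_new[idx] = rng.integers(-1, 2, size=k)
        err_new = np.mean((x @ w_new - y) ** 2)
        if err_new < err:                    # greedy accept on improvement
            w, err = w_new, err_new
    return w, err

# toy task: recover a hidden ternary weight vector from linear readings
n = 16
w_true = rng.integers(-1, 2, size=n)
x = rng.normal(size=(200, n))
y = x @ w_true
w_hat, final_err = anneal_ternary(x, y)
```

Shrinking the number of simultaneously flipped weights mirrors the annealing intuition: large, exploratory mutations early, fine single-weight refinements late, with no gradient information required, which is what makes such schemes attractive for in-situ hardware training.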