{"title":"一种新的复值神经网络激活函数:复Swish函数","authors":"M. Celebi, M. Ceylan","doi":"10.36287/setsci.4.6.050","DOIUrl":null,"url":null,"abstract":"Complex-valued artificial neural network (CVANN) has been developed to process data with complex numbers directly. Weights, threshold, inputs and outputs are all complex numbers in the CVANN. The convergence of the CVANN back propagation algorithm depends on some factors such as selection of appropriate activation function, threshold values, initial weights and normalization of data. The most important of these factors is the selection of the appropriate activation function. The selection of activation function determines the convergence and general formation characteristics of the complex back propagation algorithm. In this study, the swish activation function discovered by Google researchers Prajit Ramachandra, Barret Zoph and Quoc V. Le is discussed in the complex domain. Swish activation function, which gives good results in real plane, has been studied in the complex plane. We have compared the performance of swish activation functions on the complex XOR and symmetry problems with other known activation functions. The simulations’ results show that the proposed network using swish activation function, gives the best results when compared to other networks using the traditional complex logarithmic sigmoid and tangent sigmoid activation functions.","PeriodicalId":6817,"journal":{"name":"4th International Symposium on Innovative Approaches in Engineering and Natural Sciences Proceedings","volume":"49 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2019-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"The New Activation Function for Complex Valued Neural Networks: Complex Swish Function\",\"authors\":\"M. Celebi, M. Ceylan\",\"doi\":\"10.36287/setsci.4.6.050\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Complex-valued artificial neural network (CVANN) has been developed to process data with complex numbers directly. Weights, threshold, inputs and outputs are all complex numbers in the CVANN. The convergence of the CVANN back propagation algorithm depends on some factors such as selection of appropriate activation function, threshold values, initial weights and normalization of data. The most important of these factors is the selection of the appropriate activation function. The selection of activation function determines the convergence and general formation characteristics of the complex back propagation algorithm. In this study, the swish activation function discovered by Google researchers Prajit Ramachandra, Barret Zoph and Quoc V. Le is discussed in the complex domain. Swish activation function, which gives good results in real plane, has been studied in the complex plane. We have compared the performance of swish activation functions on the complex XOR and symmetry problems with other known activation functions. 
The simulations’ results show that the proposed network using swish activation function, gives the best results when compared to other networks using the traditional complex logarithmic sigmoid and tangent sigmoid activation functions.\",\"PeriodicalId\":6817,\"journal\":{\"name\":\"4th International Symposium on Innovative Approaches in Engineering and Natural Sciences Proceedings\",\"volume\":\"49 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-07-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"4th International Symposium on Innovative Approaches in Engineering and Natural Sciences Proceedings\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.36287/setsci.4.6.050\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"4th International Symposium on Innovative Approaches in Engineering and Natural Sciences Proceedings","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.36287/setsci.4.6.050","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
Complex-valued artificial neural networks (CVANNs) have been developed to process complex-valued data directly. In a CVANN, the weights, thresholds, inputs, and outputs are all complex numbers. The convergence of the CVANN back-propagation algorithm depends on factors such as the choice of activation function, the threshold values, the initial weights, and the normalization of the data. The most important of these factors is the choice of activation function, which determines the convergence and general behavior of the complex back-propagation algorithm. In this study, the Swish activation function, proposed by the Google researchers Prajit Ramachandran, Barret Zoph, and Quoc V. Le, is considered in the complex domain. Swish, which gives good results in the real domain, is studied here in the complex plane. We compare the performance of the Swish activation function against other known activation functions on the complex XOR and symmetry problems. The simulation results show that the proposed network using the Swish activation function gives the best results compared to networks using the traditional complex logarithmic-sigmoid and tangent-sigmoid activation functions.
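The abstract defines real Swish as swish(x) = x * sigmoid(beta * x) but does not spell out its complex-domain form. A common convention in CVANNs is the "split" approach, in which a real activation is applied independently to the real and imaginary parts of the pre-activation. The Python sketch below illustrates that convention; the function name complex_swish_split and the split formulation are assumptions for illustration, not necessarily the paper's exact construction.

```python
import numpy as np

def sigmoid(x):
    # Real-valued logistic sigmoid.
    return 1.0 / (1.0 + np.exp(-x))

def swish(x, beta=1.0):
    # Real Swish (Ramachandran et al.): x * sigmoid(beta * x).
    return x * sigmoid(beta * x)

def complex_swish_split(z, beta=1.0):
    # Hypothetical "split" complex Swish: apply the real Swish
    # separately to the real and imaginary parts of z. This split
    # convention is common in CVANNs, but the paper's exact complex
    # formulation may differ.
    return swish(z.real, beta) + 1j * swish(z.imag, beta)

# Example: activate two complex pre-activation values.
z = np.array([0.5 + 0.8j, -1.2 + 0.3j])
print(complex_swish_split(z))
```

A split-type activation keeps each component bounded in behavior like its real counterpart, which is one reason it is a frequent choice when porting real activations such as the logarithmic sigmoid or tangent sigmoid to complex back-propagation.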