{"title":"A robust canonical correlation neural network","authors":"Zhenkun Gou, C. Fyfe","doi":"10.1109/NNSP.2002.1030035","DOIUrl":null,"url":null,"abstract":"We review a neural implementation of canonical correlation analysis and show, using ideas suggested by ridge regression, how to make the algorithm robust. The network is shown to operate on data sets which exhibit multicollinearity. We develop a second model which not only performs as well on multicollinear data but also on general data sets. This model allows us to vary a single parameter so that the network is capable of performing partial least squares regression (at one extreme) to canonical correlation analysis (at the other) and every intermediate operation between the two. On multicollinear data, the parameter setting is shown to be important but on more general data no particular parameter setting is required. Finally, the algorithm acts on such data as a smoother in that the resulting weight vectors are much smoother and more interpretable than the weights without the robustification term.","PeriodicalId":117945,"journal":{"name":"Proceedings of the 12th IEEE Workshop on Neural Networks for Signal Processing","volume":"61 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2002-11-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 12th IEEE Workshop on Neural Networks for Signal Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NNSP.2002.1030035","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
We review a neural implementation of canonical correlation analysis and show, using ideas suggested by ridge regression, how to make the algorithm robust. The network is shown to operate on data sets which exhibit multicollinearity. We develop a second model which performs well not only on multicollinear data but also on general data sets. This model allows us to vary a single parameter so that the network ranges from partial least squares regression (at one extreme) to canonical correlation analysis (at the other), and can perform every intermediate operation between the two. On multicollinear data the parameter setting is shown to be important, but on more general data no particular setting is required. Finally, the algorithm acts as a smoother on such data, in that the resulting weight vectors are much smoother and more interpretable than the weights obtained without the robustification term.
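The sketch below is not the authors' neural network; it is a minimal batch (generalised eigenvalue) illustration of the regularised-CCA idea the abstract describes, assuming the standard ridge-style formulation in which a single parameter lam interpolates between canonical correlation analysis (lam = 0) and a PLS-like maximal-covariance criterion (lam = 1). The function name regularised_cca and all variable names are illustrative assumptions.

```python
# Illustrative sketch only: ridge-regularised CCA via a generalised
# eigenproblem, with lam interpolating between CCA (lam=0) and a
# PLS-like covariance criterion (lam=1). Not the paper's algorithm.
import numpy as np
from scipy.linalg import eigh, block_diag

def regularised_cca(X, Y, lam=0.0):
    """First pair of weight vectors maximising w_x' C_xy w_y subject to
    w_x' ((1-lam) C_xx + lam I) w_x = w_y' ((1-lam) C_yy + lam I) w_y = 1
    (up to the joint normalisation returned by the eigensolver)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx, Cyy = X.T @ X / n, Y.T @ Y / n
    Cxy = X.T @ Y / n

    # Ridge-style smoothing of the within-set covariance matrices.
    Rx = (1 - lam) * Cxx + lam * np.eye(Cxx.shape[0])
    Ry = (1 - lam) * Cyy + lam * np.eye(Cyy.shape[0])

    # Generalised symmetric eigenproblem  A w = rho B w.
    A = np.block([[np.zeros_like(Cxx), Cxy],
                  [Cxy.T, np.zeros_like(Cyy)]])
    B = block_diag(Rx, Ry)
    vals, vecs = eigh(A, B)

    w = vecs[:, -1]                              # leading eigenvector
    wx, wy = w[:Cxx.shape[0]], w[Cxx.shape[0]:]
    return wx, wy, vals[-1]

# Example: lam=0 recovers CCA, lam=1 a PLS-like solution.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Y = X @ rng.normal(size=(5, 4)) + 0.5 * rng.normal(size=(200, 4))
for lam in (0.0, 0.5, 1.0):
    wx, wy, rho = regularised_cca(X, Y, lam)
    print(f"lam={lam:.1f}  criterion={rho:.3f}")
```

On multicollinear data the choice of lam matters because the unregularised within-set covariances are near-singular; adding lam * I keeps the constraint matrices well conditioned, which is the smoothing effect the abstract refers to.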