Title: L1/2 Regularization-Based Deep Incremental Non-negative Matrix Factorization for Tumor Recognition
Authors: Lulu Yan, Xiaohui Yang
DOI: 10.1145/3469678.3469691
Venue: The Fifth International Conference on Biological Information and Biomedical Engineering
Published: 2021-07-20
Citations: 0
Abstract
Non-negative matrix factorization (NMF) is an effective technique for feature representation learning and dimensionality reduction. However, two critical challenges limit the performance of NMF-based methods: the sparsity of the learned representation and the sensitivity of the iterative algorithm to its initial values. To address these problems, L1/2 regularization is selected to characterize the sparsity of the data. Furthermore, a layer-wise pre-training strategy from deep learning is used to alleviate the effect of initialization on NMF while avoiding a complex network structure. On this basis, an L1/2 regularization-based deep NMF (L1/2-DNMF) model is proposed, yielding a more stable and sparse deep representation. Moreover, incremental learning is introduced to reduce the high computational complexity of the L1/2-DNMF model; the resulting L1/2-DINMF model is suitable for online processing. Experimental results on genetic-data-based tumor recognition verify that the proposed L1/2-DINMF model outperforms classic and state-of-the-art methods.
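The building block described above, NMF with an L1/2 sparsity penalty on the coefficient matrix, can be sketched with multiplicative updates. This is an illustrative reconstruction rather than the authors' published L1/2-DNMF/L1/2-DINMF algorithm: the penalty weight `lam`, the initialization, and the update rule for `H` follow the commonly used L1/2-regularized NMF formulation, where the penalty contributes a `(lam/2) * H**-0.5` term to the denominator of the update for `H`.

```python
import numpy as np

def l12_nmf(X, rank, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """Illustrative sketch: minimize ||X - W H||_F^2 + lam * sum(sqrt(H))
    over non-negative W, H via multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Standard multiplicative update for W (no penalty on W).
        W *= (X @ H.T) / (W @ H @ H.T + eps)
        W = np.maximum(W, eps)  # keep entries strictly positive
        # The L1/2 penalty adds (lam/2) * H^{-1/2} to the denominator.
        H *= (W.T @ X) / (W.T @ W @ H + 0.5 * lam * H ** -0.5 + eps)
        H = np.maximum(H, eps)  # avoid division by zero in H ** -0.5
    return W, H

# Toy usage on random non-negative data (not genetic data).
X = np.abs(np.random.default_rng(1).random((20, 30)))
W, H = l12_nmf(X, rank=5)
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

In a deep (multi-layer) variant, the basis matrix of one layer would itself be factorized again, with each layer pre-trained in this fashion before fine-tuning, which is the layer-wise strategy the abstract refers to.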