{"title":"Two-dimensional sparse fractional Fourier transform and its applications","authors":"Deyun Wei, Jun Yang","doi":"10.2139/ssrn.4103340","DOIUrl":"https://doi.org/10.2139/ssrn.4103340","url":null,"abstract":"","PeriodicalId":21745,"journal":{"name":"Signal Process.","volume":"17 1","pages":"108682"},"PeriodicalIF":0.0,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72549287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2022-07-01, DOI: 10.1016/j.sigpro.2022.108693
M. Tanda
{"title":"Asymptotic performance of FBMC-PAM systems in frequency-selective Rayleigh fading channels","authors":"M. Tanda","doi":"10.1016/j.sigpro.2022.108693","DOIUrl":"https://doi.org/10.1016/j.sigpro.2022.108693","url":null,"abstract":"","PeriodicalId":21745,"journal":{"name":"Signal Process.","volume":"72 1","pages":"108693"},"PeriodicalIF":0.0,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89822702","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2022-07-01, DOI: 10.1016/j.sigpro.2022.108692
Y. Wang, Wen-Xia Yang, Dan Li, J. Zhang
{"title":"A novel time-frequency model, analysis and parameter estimation approach: Towards multiple close and crossed chirp modes","authors":"Y. Wang, Wen-Xia Yang, Dan Li, J. Zhang","doi":"10.1016/j.sigpro.2022.108692","DOIUrl":"https://doi.org/10.1016/j.sigpro.2022.108692","url":null,"abstract":"","PeriodicalId":21745,"journal":{"name":"Signal Process.","volume":"116 1","pages":"108692"},"PeriodicalIF":0.0,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88471551","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Reversible data hiding in JPEG images based on coefficient-first selection","authors":"Xie Yang, Taoyu Wu, Fangjun Huang","doi":"10.2139/ssrn.4021942","DOIUrl":"https://doi.org/10.2139/ssrn.4021942","url":null,"abstract":"","PeriodicalId":21745,"journal":{"name":"Signal Process.","volume":"1 1","pages":"108639"},"PeriodicalIF":0.0,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88582433","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The task of recovering a low-rank matrix from an incomplete matrix, also termed matrix completion, arises in various applications. Methods for matrix completion can be classified into linear and nonlinear approaches. Although the linear model provides basic theory ensuring that the missing entries can be restored with high probability, it has an obvious limitation: the latent factors are restricted to a linear subspace. Thus, nonlinear models have been suggested, mainly implemented using neural networks. In this paper, a novel and interpretable neural network is developed for matrix completion. Different from existing neural networks, whose structure is created by empirical design, the proposed version is devised by unfolding the matrix factorization formulation. Specifically, the two factors produced by matrix factorization construct the two branches of the suggested network, called the bi-branch neural network (BiBNN). The row and column indices of each entry are taken as the input of the BiBNN, while its output is the estimated value of that entry. The training procedure minimizes the fitting error between all observed entries and their predicted values, and the unknown entries are then estimated by inputting their coordinates into the trained network. The BiBNN is compared with state-of-the-art methods, including linear and nonlinear models, on synthetic data, image inpainting, and recommender systems. Experimental results demonstrate that the BiBNN is superior to the existing approaches in terms of restoration accuracy.
{"title":"An interpretable bi-branch neural network for matrix completion","authors":"Xiao Peng Li, Maolin Wang, H. So","doi":"10.2139/ssrn.4006034","DOIUrl":"https://doi.org/10.2139/ssrn.4006034","url":null,"abstract":"The task of recovering a low-rank matrix given an incomplete matrix, also termed as matrix completion, arises in various applications. Methods for matrix completion can be classified into linear and nonlinear approaches. Despite the fact that the linear model provides basic theories ensuring restoring the missing entries with high probability, it has an obvious limitation that latent factors are restricted in the linear subspace. Thus, the nonlinear model has been suggested, which is mainly performed using neural networks. In this paper, a novel and interpretable neural network is developed for matrix completion. Different from existing neural networks whose structure is created by empirical design, the proposed version is devised via unfolding the matrix factorization formulation. Specifically, the two factors decomposed by matrix factorization construct the two branches of the suggested neural network, called bi-branch neural network (BiBNN). The row and column indices of each entry are considered as the input of the BiBNN, while its output is the estimated value of the entry. The training procedure aims to minimize the fit-ting error between all observed entries and their predicted values and then the unknown entries are estimated by inputting their coordinates into the trained network. The BiBNN is compared with state-of-the-art methods, including linear and nonlinear models, in processing synthetic data, image inpainting, and recommender system. Experimental results demonstrate that the BiBNN is superior to the existing approaches in terms of restoration accuracy.","PeriodicalId":21745,"journal":{"name":"Signal Process.","volume":"7 1","pages":"108640"},"PeriodicalIF":0.0,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79312502","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2022-05-09, DOI: 10.1016/j.sigpro.2022.108728
Y. Ono, Linyu Peng
{"title":"Towards a median signal detector through the total Bregman divergence and its robustness analysis","authors":"Y. Ono, Linyu Peng","doi":"10.1016/j.sigpro.2022.108728","DOIUrl":"https://doi.org/10.1016/j.sigpro.2022.108728","url":null,"abstract":"","PeriodicalId":21745,"journal":{"name":"Signal Process.","volume":"46 1","pages":"108728"},"PeriodicalIF":0.0,"publicationDate":"2022-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77248478","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}