{"title":"Design of a Genetic Algorithm Based Dynamic Learning Method for Improved Channel Modelling in mmWave Radios via Temporal Breakpoint Analysis","authors":"A. Bhoi, V. Hendre","doi":"10.18280/ts.400420","DOIUrl":null,"url":null,"abstract":"Dynamic channel modelling allows communication interfaces to integrate continuous learning operations for incremental BER reductions. These models scan temporal BER patterns, and then tune internal-channel parameters in order to improving communication efficiency under real-time traffic scenarios. But these models showcase high complexity, thus cannot be scaled to large-scale network deployments. Moreover, these models are not flexible, and do not support denser channel models, which restricts their applicability under real-time scenarios. To overcome these issues, this text proposes design of a novel dynamic learning method for improved channel modelling in Phased array antennas mm Wave radios via temporal breakpoint analysis. The model initially collects information about channel BER and uses a Grey Wolf Optimization (GWO) technique to improve its internal model parameters. These parameters are further tuned via a novel breakpoint model, which enables for continuous and light-weighted tuning of channel modelling parameters. This allows the model to incrementally reduce BER even under denser noise levels. The model is further cascaded with a Q-Learning based optimization process, which assists in improving channel modelling efficiency for large-scale networks. Due to these integrations, the model is capable of reducing Bit Error Rate (BER) by 8.3% when compared with standard channel modelling techniques that use Convolutional Neural Networks (CNNs), Sparse Bayesian Learning, etc. These methods were selected for comparison due to their higher efficiency and scalability when applied to real-time communication scenarios. The model also showcased 6.5% lower computational delay due to linear processing operations. It was able to achieve 10.4% better channel coverage, 8.5% higher throughput, and 4.9% higher channel estimation accuracy, which makes it useful for a wide","PeriodicalId":49430,"journal":{"name":"Traitement Du Signal","volume":" ","pages":""},"PeriodicalIF":1.2000,"publicationDate":"2023-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Traitement Du Signal","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.18280/ts.400420","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Dynamic channel modelling allows communication interfaces to integrate continuous learning operations for incremental BER reduction. These models scan temporal BER patterns and then tune internal channel parameters to improve communication efficiency under real-time traffic scenarios. However, such models exhibit high complexity and therefore cannot be scaled to large-scale network deployments. Moreover, they are inflexible and do not support denser channel models, which restricts their applicability in real-time scenarios. To overcome these issues, this paper proposes a novel dynamic learning method for improved channel modelling in phased-array-antenna mmWave radios via temporal breakpoint analysis. The model initially collects information about channel BER and uses a Grey Wolf Optimization (GWO) technique to improve its internal model parameters. These parameters are further tuned via a novel breakpoint model, which enables continuous, lightweight tuning of channel modelling parameters. This allows the model to incrementally reduce BER even under denser noise levels. The model is further cascaded with a Q-Learning based optimization process, which assists in improving channel modelling efficiency for large-scale networks. Due to these integrations, the model reduces Bit Error Rate (BER) by 8.3% when compared with standard channel modelling techniques that use Convolutional Neural Networks (CNNs), Sparse Bayesian Learning, etc. These methods were selected for comparison because of their efficiency and scalability in real-time communication scenarios. The model also showed 6.5% lower computational delay owing to its linear processing operations, and it achieved 10.4% better channel coverage, 8.5% higher throughput, and 4.9% higher channel estimation accuracy, which makes it useful for a wide range of real-time communication deployments.
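The abstract names Grey Wolf Optimization (GWO) as the first stage that tunes internal channel-model parameters against measured BER. The paper's actual parameterization, fitness function, and GWO settings are not given here, so the following is a minimal, illustrative sketch of canonical GWO minimising a placeholder BER cost; `ber_proxy`, the search bounds, and all hyperparameters are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

def grey_wolf_optimize(fitness, bounds, n_wolves=20, n_iter=200, seed=0):
    """Minimise `fitness` over a box-constrained search space with GWO.

    fitness -- maps a parameter vector to a scalar cost (here, a BER
               estimate for one candidate channel-model configuration).
    bounds  -- sequence of (low, high) pairs, one per parameter.
    """
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = bounds.shape[0]

    # The "pack": each wolf is one candidate channel-parameter vector.
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))

    for t in range(n_iter):
        costs = np.array([fitness(w) for w in wolves])
        # Alpha, beta, delta are the three best solutions found so far;
        # the rest of the pack is pulled toward their averaged guidance.
        alpha, beta, delta = wolves[np.argsort(costs)[:3]]

        a = 2.0 * (1.0 - t / n_iter)   # exploration weight decays 2 -> 0
        pull = np.zeros_like(wolves)
        for leader in (alpha, beta, delta):
            r1 = rng.random((n_wolves, dim))
            r2 = rng.random((n_wolves, dim))
            A = 2.0 * a * r1 - a       # step scale (may be negative)
            C = 2.0 * r2               # leader-emphasis weight
            D = np.abs(C * leader - wolves)
            pull += leader - A * D
        wolves = np.clip(pull / 3.0, lo, hi)

    costs = np.array([fitness(w) for w in wolves])
    best = int(np.argmin(costs))
    return wolves[best], float(costs[best])

# Hypothetical stand-in for the paper's BER measurement: a smooth cost
# whose minimum plays the role of the best-performing parameter setting.
def ber_proxy(params):
    target = np.array([0.3, -1.2, 0.7])   # illustrative optimum
    return float(np.sum((params - target) ** 2))

best_params, best_cost = grey_wolf_optimize(ber_proxy, bounds=[(-2, 2)] * 3)
print("tuned parameters:", best_params, "proxy BER cost:", best_cost)
```

In the proposed pipeline this GWO stage would be followed by the breakpoint-based incremental tuning and the Q-Learning refinement, neither of which is specified in the abstract in enough detail to sketch faithfully.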
About the journal:
TS provides rapid dissemination of original research in the fields of signal processing, imaging and vision. Since its founding in 1984, the journal has published articles that present original research results of a fundamental, methodological or applied nature. The editorial board welcomes articles on the latest and most promising results of academic research, including both theoretical results and case studies.
TS welcomes original research papers, technical notes and review articles on various disciplines, including but not limited to:
Signal processing
Imaging
Vision
Control
Filtering
Compression
Data transmission
Noise reduction
Deconvolution
Prediction
Identification
Classification.