{"title":"利用机器学习提高蜂窝网络的吞吐量——以LTE为例","authors":"Prasad Gaikwad, Saidhiraj Amuru, K. Kuchi","doi":"10.1109/NCC52529.2021.9530047","DOIUrl":null,"url":null,"abstract":"Long Term Evolution (LTE) focused on providing high data rates at low latency when compared to previous-generation technologies. The recent research and development in machine learning for wireless communication networks focus on making these networks more efficient, intelligent, and optimal. We propose a machine learning algorithm to improve the performance of LTE in a real-time deployments. Specifically, we focus on the case of single-user multiple-input multiple-output transmission mode (TM4 as known in LTE). The channel quality feedback from user to the base stations plays a crucial role to ensure successful communication with low error rate in this transmission mode. The feedback from the user includes precoding matrix indicator (PMI), rank indicator apart from the channel quality feedback. However, in practical systems, as the base station must support several users, there is a delay expected from the time a user sends feedback until the time it is scheduled. This time lag can cause significant performance degradation depending on the channel conditions and also in cases when the user is mobile. Hence, to eliminate this adverse impact, we present a machine learning model that predict future channels and the feedback from the user is calculated based on these predictions. Via several numerical simulations, we show the effectiveness of the proposed algorithms under a variety of scenarios. Without loss of generality, the same work can be applied in the context of 5G NR. LTE is used only as a case study due to its vast prevalence and deployments even as of today.","PeriodicalId":414087,"journal":{"name":"2021 National Conference on Communications (NCC)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Improving the Throughput of a Cellular Network using Machine Learning - A Case Study of LTE\",\"authors\":\"Prasad Gaikwad, Saidhiraj Amuru, K. Kuchi\",\"doi\":\"10.1109/NCC52529.2021.9530047\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Long Term Evolution (LTE) focused on providing high data rates at low latency when compared to previous-generation technologies. The recent research and development in machine learning for wireless communication networks focus on making these networks more efficient, intelligent, and optimal. We propose a machine learning algorithm to improve the performance of LTE in a real-time deployments. Specifically, we focus on the case of single-user multiple-input multiple-output transmission mode (TM4 as known in LTE). The channel quality feedback from user to the base stations plays a crucial role to ensure successful communication with low error rate in this transmission mode. The feedback from the user includes precoding matrix indicator (PMI), rank indicator apart from the channel quality feedback. However, in practical systems, as the base station must support several users, there is a delay expected from the time a user sends feedback until the time it is scheduled. This time lag can cause significant performance degradation depending on the channel conditions and also in cases when the user is mobile. 
Hence, to eliminate this adverse impact, we present a machine learning model that predict future channels and the feedback from the user is calculated based on these predictions. Via several numerical simulations, we show the effectiveness of the proposed algorithms under a variety of scenarios. Without loss of generality, the same work can be applied in the context of 5G NR. LTE is used only as a case study due to its vast prevalence and deployments even as of today.\",\"PeriodicalId\":414087,\"journal\":{\"name\":\"2021 National Conference on Communications (NCC)\",\"volume\":\"10 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-07-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 National Conference on Communications (NCC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/NCC52529.2021.9530047\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 National Conference on Communications (NCC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NCC52529.2021.9530047","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Improving the Throughput of a Cellular Network using Machine Learning - A Case Study of LTE
Long Term Evolution (LTE) focused on providing high data rates at low latency compared to previous-generation technologies. Recent research and development in machine learning for wireless communication networks focuses on making these networks more efficient, intelligent, and optimal. We propose a machine learning algorithm to improve the performance of LTE in real-time deployments. Specifically, we focus on the single-user multiple-input multiple-output transmission mode (known as TM4 in LTE). In this transmission mode, the channel quality feedback from the user to the base station plays a crucial role in ensuring successful communication with a low error rate. Apart from the channel quality feedback, the feedback from the user includes the precoding matrix indicator (PMI) and the rank indicator. However, in practical systems, because the base station must support several users, a delay is expected between the time a user sends feedback and the time it is scheduled. This time lag can cause significant performance degradation depending on the channel conditions, and particularly when the user is mobile. Hence, to eliminate this adverse impact, we present a machine learning model that predicts future channels, and the feedback from the user is calculated based on these predictions. Via several numerical simulations, we show the effectiveness of the proposed algorithms under a variety of scenarios. Without loss of generality, the same work can be applied in the context of 5G NR; LTE is used only as a case study due to its widespread prevalence and deployments even today.
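The idea of computing feedback on a predicted channel rather than an outdated estimate can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it assumes a 2x2 MIMO link, a first-order (Gauss-Markov) toy channel model, a simple per-entry least-squares AR(1) predictor standing in for the paper's machine learning model, and a 2-Tx, rank-1 LTE-style codebook for PMI selection.

```python
# Hedged sketch (assumptions noted above, not the authors' method): predict the
# next channel matrix from a short history with a least-squares AR(1) fit, then
# pick the PMI that maximises received power on the *predicted* channel instead
# of the outdated one.
import numpy as np

rng = np.random.default_rng(0)

# 2-antenna, rank-1 LTE-style codebook (unit-norm precoding vectors).
CODEBOOK = [np.array([[1], [1]]) / np.sqrt(2),
            np.array([[1], [-1]]) / np.sqrt(2),
            np.array([[1], [1j]]) / np.sqrt(2),
            np.array([[1], [-1j]]) / np.sqrt(2)]

def fit_ar1(history):
    """Least-squares fit of h[t+1] ~ a * h[t], one coefficient per channel entry."""
    X = np.stack([h.ravel() for h in history[:-1]])   # (T-1, Nr*Nt)
    Y = np.stack([h.ravel() for h in history[1:]])    # (T-1, Nr*Nt)
    return np.sum(np.conj(X) * Y, axis=0) / np.sum(np.abs(X) ** 2, axis=0)

def predict_next(history):
    """Predict the next channel matrix from the most recent observation."""
    a = fit_ar1(history)
    return (a * history[-1].ravel()).reshape(history[-1].shape)

def select_pmi(H):
    """Return the codebook index maximising ||H w||^2 for the given channel."""
    gains = [np.linalg.norm(H @ w) ** 2 for w in CODEBOOK]
    return int(np.argmax(gains))

# Toy example: 2x2 channel evolving with a first-order Gauss-Markov model.
rho, T = 0.95, 20
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
history = [H]
for _ in range(T):
    noise = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
    H = rho * H + np.sqrt(1 - rho ** 2) * noise
    history.append(H)

H_pred = predict_next(history[:-1])            # feedback computed on the prediction
print("PMI on outdated channel   :", select_pmi(history[-2]))
print("PMI on predicted channel  :", select_pmi(H_pred))
print("PMI on true future channel:", select_pmi(history[-1]))
```

In a deployed system the predictor would be trained on reference-signal-based channel estimates, and the reported feedback would also include the CQI and rank indicator; the sketch only shows how the PMI decision can shift when it is computed on a predicted channel rather than on the stale one available at scheduling time.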