{"title":"学习快速优化混合预编码","authors":"Ortal Agiv, Nir Shlezinger","doi":"10.1109/spawc51304.2022.9833923","DOIUrl":null,"url":null,"abstract":"Hybrid precoding is expected to play a key role in realizing massive multiple-input multiple-output (MIMO) transmitters with controllable cost, size, and power. MIMO transmitters are required to frequently adapt their precoding patterns based on the variation in the channel conditions. In the hybrid setting, such an adaptation often involves lengthy optimization which may affect the network performance. In this work we employ the emerging learn-to-optimize paradigm to enable rapid optimization of hybrid precoders. In particular, we leverage data to learn iteration-dependent hyperparameter setting of projected gradient optimization, thus preserving the fully interpretable flow of the optimizer while improving its convergence speed. Numerical results demonstrate that our approach yields six to twelve times faster convergence compared to conventional optimization with shared hyperparameters, while achieving similar and even improved sum-rate performance.","PeriodicalId":423807,"journal":{"name":"2022 IEEE 23rd International Workshop on Signal Processing Advances in Wireless Communication (SPAWC)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"Learn to Rapidly Optimize Hybrid Precoding\",\"authors\":\"Ortal Agiv, Nir Shlezinger\",\"doi\":\"10.1109/spawc51304.2022.9833923\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Hybrid precoding is expected to play a key role in realizing massive multiple-input multiple-output (MIMO) transmitters with controllable cost, size, and power. MIMO transmitters are required to frequently adapt their precoding patterns based on the variation in the channel conditions. In the hybrid setting, such an adaptation often involves lengthy optimization which may affect the network performance. In this work we employ the emerging learn-to-optimize paradigm to enable rapid optimization of hybrid precoders. In particular, we leverage data to learn iteration-dependent hyperparameter setting of projected gradient optimization, thus preserving the fully interpretable flow of the optimizer while improving its convergence speed. 
Numerical results demonstrate that our approach yields six to twelve times faster convergence compared to conventional optimization with shared hyperparameters, while achieving similar and even improved sum-rate performance.\",\"PeriodicalId\":423807,\"journal\":{\"name\":\"2022 IEEE 23rd International Workshop on Signal Processing Advances in Wireless Communication (SPAWC)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-07-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE 23rd International Workshop on Signal Processing Advances in Wireless Communication (SPAWC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/spawc51304.2022.9833923\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 23rd International Workshop on Signal Processing Advances in Wireless Communication (SPAWC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/spawc51304.2022.9833923","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7

Abstract

Hybrid precoding is expected to play a key role in realizing massive multiple-input multiple-output (MIMO) transmitters with controllable cost, size, and power. MIMO transmitters are required to frequently adapt their precoding patterns in response to variations in the channel conditions. In the hybrid setting, such an adaptation often involves lengthy optimization, which may affect network performance. In this work we employ the emerging learn-to-optimize paradigm to enable rapid optimization of hybrid precoders. In particular, we leverage data to learn iteration-dependent hyperparameter settings of projected gradient optimization, thus preserving the fully interpretable flow of the optimizer while improving its convergence speed. Numerical results demonstrate that our approach yields six to twelve times faster convergence compared to conventional optimization with shared hyperparameters, while achieving similar and even improved sum-rate performance.
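
The abstract outlines the approach only at a high level: a projected gradient ascent (PGA) routine over the hybrid precoder is unfolded into a fixed number of iterations, and the per-iteration step sizes are treated as hyperparameters learned from channel data instead of a single shared value. The following Python sketch is a hypothetical illustration of that optimization flow, not the authors' implementation: the rate objective, the pseudo-inverse digital stage, the system dimensions, and the constant step-size values are all assumptions made here for brevity, standing in for quantities that the learn-to-optimize procedure would define or fit by backpropagating through the unfolded iterations.

```python
# A minimal, hypothetical sketch (not the authors' code) of projected gradient
# ascent (PGA) over an analog precoder with iteration-dependent step sizes.
# In the learn-to-optimize setting described in the abstract, the vector
# `step_sizes` would be trainable and fitted from channel data by
# backpropagating through these T unfolded iterations; here it is fixed,
# only to illustrate the optimization flow.
import torch

torch.manual_seed(0)

K, N, L = 4, 32, 4      # users, transmit antennas, RF chains (assumed values)
snr = 10.0              # linear SNR, assumed
T = 10                  # number of unfolded PGA iterations

# Random narrowband channel used as a stand-in for channel realizations.
H = torch.randn(K, N, dtype=torch.cfloat) / N ** 0.5


def rate(A, B):
    """Log-det achievable-rate surrogate for the hybrid precoder F = A @ B."""
    F = A @ B
    F = F / torch.linalg.norm(F)                 # unit total transmit power
    G = H @ F                                    # effective K x K channel
    M = torch.eye(K, dtype=torch.cfloat) + snr * G @ G.conj().T
    return torch.log2(torch.linalg.det(M).real)


def project_unit_modulus(A):
    """Project onto the analog phase-shifter constraint |A_ij| = 1."""
    return A / A.abs().clamp_min(1e-12)


# Iteration-dependent step sizes: these T scalars are the hyperparameters a
# learned optimizer would tune; a single shared value is the classic baseline.
step_sizes = torch.full((T,), 0.5)

A = project_unit_modulus(torch.randn(N, L, dtype=torch.cfloat))  # analog precoder
B = torch.linalg.pinv(H @ A)         # simple digital stage (sketch assumption)

for t in range(T):
    A = A.detach().requires_grad_(True)
    rate(A, B).backward()            # gradient of the rate w.r.t. the analog part
    with torch.no_grad():
        A = project_unit_modulus(A + step_sizes[t] * A.grad)  # ascend, then project
    B = torch.linalg.pinv(H @ A)     # refresh the digital stage
    print(f"iteration {t}: rate = {rate(A, B).item():.3f} bit/s/Hz")
```

In a learned variant of this sketch, `step_sizes` (and possibly other per-iteration hyperparameters) would be optimized over a training set of channels so that the rate reached after only T iterations is maximized; that data-driven, iteration-dependent tuning is the mechanism the abstract credits for the six- to twelve-fold convergence speedup over a shared step size.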