Marzieh Sheikhi, Vesal Hakami
2021 7th International Conference on Signal Processing and Intelligent Systems (ICSPIS), December 29, 2021
DOI: 10.1109/ICSPIS54653.2021.9729335
AoI-Aware Status Update Control for an Energy Harvesting Source over an Uplink mmWave Channel
In new-generation networks, data freshness plays a prominent role in real-time systems. The age of information (AoI), a recently introduced metric, measures the time elapsed since the generation of the most recently received update. This paper considers a real-time scenario in which a source node samples measurements and forwards them to a monitoring center over a millimeter-wave (mmWave) channel. The source node is equipped with a finite rechargeable battery and harvests energy from the environment. We formulate a remote monitoring problem that captures the tradeoff between minimizing the long-term average AoI and the energy consumption of the source node. We cast the problem as a Markov decision process (MDP) and, as a model-free reinforcement learning approach, use the Q-learning algorithm to obtain a policy that minimizes the long-term average AoI. Our evaluations investigate the convergence of the algorithm as well as the impact of the problem parameters on the average AoI and the average energy consumption. Simulation results show that, compared with two baselines (a random policy and a greedy, i.e., myopic, policy), the proposed Q-learning-based algorithm keeps the data fresher and consumes less energy by accounting for possible future system states.