{"title":"RECC:使用深度强化学习的关系增强内容缓存算法","authors":"Jiarui Ren, Haiyan Zhang, Xiaoping Zhou, Menghan Zhu","doi":"10.1109/ACAIT56212.2022.10137967","DOIUrl":null,"url":null,"abstract":"Mobile edge caching (MEC) is a promising technology to alleviate traffic congestion in the network. Current studies explored deep reinforcement learning (DRL)-based MEC methods. These methods consider the dynamics of the request size to maximize the cache hit rate. However, they usually ignored the potential request relationships among contents. Two contents with a strong relationship are usually requested sequentially. Inspired by this assumption, this paper proposes a relationship-enhanced content caching algorithm using DRL, named RECC. Our RECC infers user preferences by mining the request relationships among contents. In this work, the relationships are modeled as request sequences, and the request features are learned by using graph embedding. These features will be used as input of state in our DRL-based algorithm. We utilize the Wolpertinger architecture to solve the limitation of large discrete action space. The simulation results indicate that our RECC outperformed the traditional cache policies and state-of-the-art DRL-based method in cache hit rate. Furthermore, the proposed RECC has advantages in long-term stability in the environment where content popularity changes dynamically, and also has a higher cache hit rate when handling the requests with number changes dynamically.","PeriodicalId":398228,"journal":{"name":"2022 6th Asian Conference on Artificial Intelligence Technology (ACAIT)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"RECC: A Relationship-Enhanced Content Caching Algorithm Using Deep Reinforcement Learning\",\"authors\":\"Jiarui Ren, Haiyan Zhang, Xiaoping Zhou, Menghan Zhu\",\"doi\":\"10.1109/ACAIT56212.2022.10137967\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Mobile edge caching (MEC) is a promising technology to alleviate traffic congestion in the network. Current studies explored deep reinforcement learning (DRL)-based MEC methods. These methods consider the dynamics of the request size to maximize the cache hit rate. However, they usually ignored the potential request relationships among contents. Two contents with a strong relationship are usually requested sequentially. Inspired by this assumption, this paper proposes a relationship-enhanced content caching algorithm using DRL, named RECC. Our RECC infers user preferences by mining the request relationships among contents. In this work, the relationships are modeled as request sequences, and the request features are learned by using graph embedding. These features will be used as input of state in our DRL-based algorithm. We utilize the Wolpertinger architecture to solve the limitation of large discrete action space. The simulation results indicate that our RECC outperformed the traditional cache policies and state-of-the-art DRL-based method in cache hit rate. 
Furthermore, the proposed RECC has advantages in long-term stability in the environment where content popularity changes dynamically, and also has a higher cache hit rate when handling the requests with number changes dynamically.\",\"PeriodicalId\":398228,\"journal\":{\"name\":\"2022 6th Asian Conference on Artificial Intelligence Technology (ACAIT)\",\"volume\":\"7 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 6th Asian Conference on Artificial Intelligence Technology (ACAIT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ACAIT56212.2022.10137967\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 6th Asian Conference on Artificial Intelligence Technology (ACAIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACAIT56212.2022.10137967","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
RECC: A Relationship-Enhanced Content Caching Algorithm Using Deep Reinforcement Learning
Mobile edge caching (MEC) is a promising technology for alleviating traffic congestion in the network. Recent studies have explored deep reinforcement learning (DRL)-based MEC methods that consider the dynamics of request size to maximize the cache hit rate. However, they usually ignore the potential request relationships among contents: two contents with a strong relationship are usually requested sequentially. Inspired by this assumption, this paper proposes RECC, a relationship-enhanced content caching algorithm using DRL. RECC infers user preferences by mining the request relationships among contents. In this work, the relationships are modeled as request sequences, and request features are learned via graph embedding; these features serve as the state input of our DRL-based algorithm. We adopt the Wolpertinger architecture to address the limitation imposed by the large discrete action space. Simulation results indicate that RECC outperforms traditional caching policies and a state-of-the-art DRL-based method in cache hit rate. Furthermore, RECC maintains long-term stability in environments where content popularity changes dynamically, and achieves a higher cache hit rate when the number of requests changes dynamically.
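The abstract names two mechanisms without giving implementation detail: modeling request relationships as sequences over the content catalogue, and Wolpertinger-style action selection to cope with the large discrete action space. The sketch below is a minimal illustration of both ideas, not the paper's code; the function names, the `actor`/`critic` callables, the `content_embeddings` matrix, and the neighbourhood size `k` are all assumptions introduced for illustration.

```python
import numpy as np
import networkx as nx


def build_request_graph(request_sequences):
    """Model request relationships as a weighted directed graph: an edge
    (a, b) means content b was requested immediately after content a,
    and the edge weight counts how often that happened."""
    g = nx.DiGraph()
    for seq in request_sequences:
        for a, b in zip(seq, seq[1:]):
            if g.has_edge(a, b):
                g[a][b]["weight"] += 1
            else:
                g.add_edge(a, b, weight=1)
    return g


def wolpertinger_select(state, actor, critic, content_embeddings, k=10):
    """Wolpertinger-style action selection over a large discrete catalogue:
    the actor proposes a continuous proto-action, the k contents nearest to
    it in embedding space are retrieved, and the critic picks the best one."""
    proto_action = actor(state)                                    # (embed_dim,)
    dists = np.linalg.norm(content_embeddings - proto_action, axis=1)
    candidates = np.argsort(dists)[:k]                             # k nearest content ids
    q_values = [critic(state, content_embeddings[i]) for i in candidates]
    return int(candidates[int(np.argmax(q_values))])               # content id to cache
```

Under this reading, a graph-embedding step (for example, a node2vec-style walk over the request graph) would turn graph nodes into the `content_embeddings` that also feed the DRL state; the abstract does not specify which embedding method the paper actually uses.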