Anh-Tien Tran, D. Lakew, The-Vi Nguyen, Van-Dat Tuong, Thanh Phung Truong, Nhu-Ngoc Dao, Sungrae Cho
{"title":"Hit Ratio and Latency Optimization for Caching Systems: A Survey","authors":"Anh-Tien Tran, D. Lakew, The-Vi Nguyen, Van-Dat Tuong, Thanh Phung Truong, Nhu-Ngoc Dao, Sungrae Cho","doi":"10.1109/ICOIN50884.2021.9334019","DOIUrl":null,"url":null,"abstract":"The rise of fifth-generation (5G) communication systems allows the super high-quality services to be implemented in real-life; however, it requires a massive amount of mobile data traffic to be simultaneously transmitted and processed. Fortunately, a significant percentage of mobile data traffic is indeed reusable and should be cached properly in somewhere, and then be delivered back to users’ equipment (UEs) in the future requests. To proactively utilize this nature of content distribution, the caching techniques have attracted significant attention from the research community by alleviating unnecessary duplicated data transmission of popular content in mobile edge caching enabled networks. As a result, numerous scientific approaches under different perspectives have been published and hence should be categorized through specific criteria. In this study, we systematically and extensively survey the most recent caching techniques that were published. For each caching policy, we critically analyze its target in detail by performance metrics, including hit ratio, latency, and storage efficiency. Besides, we display the current trend by sorting them into common technical classes such as machine learning, deep learning, game theory, optimization techniques, etc. To visualize and predict the application of caching algorithms, in reality, we summarize their typical use cases.","PeriodicalId":6741,"journal":{"name":"2021 International Conference on Information Networking (ICOIN)","volume":"38 1","pages":"577-581"},"PeriodicalIF":0.0000,"publicationDate":"2021-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Conference on Information Networking (ICOIN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICOIN50884.2021.9334019","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
The rise of fifth-generation (5G) communication systems enables very high-quality services to be deployed in real life; however, it also requires massive amounts of mobile data traffic to be transmitted and processed simultaneously. Fortunately, a significant fraction of this traffic is reusable: popular content can be cached at appropriate locations and delivered back to user equipment (UE) on subsequent requests. To exploit this property of content distribution, caching techniques have attracted significant attention from the research community because they eliminate unnecessary duplicate transmissions of popular content in networks with mobile edge caching. As a result, numerous approaches from different perspectives have been published and should therefore be categorized against specific criteria. In this study, we systematically and extensively survey recently published caching techniques. For each caching policy, we critically analyze its objective in detail using performance metrics, including hit ratio, latency, and storage efficiency. In addition, we illustrate the current research trend by sorting the techniques into common technical classes such as machine learning, deep learning, game theory, and optimization techniques. To anticipate how caching algorithms are applied in practice, we also summarize their typical use cases.
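As a minimal sketch of how one of the surveyed metrics, hit ratio, can be measured, the Python snippet below simulates a simple least-recently-used (LRU) cache against a synthetic, Zipf-like request trace (popular content dominates the stream, which is exactly the property edge caching exploits). The `LRUCache` class, the capacity, and the trace parameters are illustrative assumptions for this example and are not taken from the paper itself.

```python
from collections import OrderedDict
import random

class LRUCache:
    """Minimal LRU cache used only to illustrate hit-ratio measurement (not the paper's method)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()  # key -> cached content (content itself not modeled here)
        self.hits = 0
        self.requests = 0

    def request(self, key) -> bool:
        """Process one content request; return True on a cache hit."""
        self.requests += 1
        if key in self.store:
            self.store.move_to_end(key)  # mark as most recently used
            self.hits += 1
            return True
        # Cache miss: fetch from origin (not modeled) and insert, evicting the LRU item if full.
        if len(self.store) >= self.capacity:
            self.store.popitem(last=False)
        self.store[key] = None
        return False

    @property
    def hit_ratio(self) -> float:
        return self.hits / self.requests if self.requests else 0.0


if __name__ == "__main__":
    random.seed(0)
    cache = LRUCache(capacity=50)
    # Zipf-like popularity: a few items receive most of the requests.
    catalog = list(range(1000))
    weights = [1.0 / (rank + 1) for rank in range(len(catalog))]
    for _ in range(10_000):
        item = random.choices(catalog, weights=weights, k=1)[0]
        cache.request(item)
    print(f"hit ratio: {cache.hit_ratio:.3f}")
```

The same harness could be reused to compare the caching policies surveyed in the paper: swap the eviction logic while keeping the request trace fixed, and the resulting hit ratio (and, with a simple delay model, latency) becomes directly comparable across policies.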