The Optimization of LRU Algorithm Based on Pre-Selection and Cache Prefetching of Files in Hybrid Cloud

Shumeng Du, Chunlin Li, XiJun Mao, Wei Yan
{"title":"混合云中基于文件预选和缓存预取的LRU算法优化","authors":"Shumeng Du, Chunlin Li, XiJun Mao, Wei Yan","doi":"10.1109/PDCAT.2016.039","DOIUrl":null,"url":null,"abstract":"In recent years, the research on caching in cloud environment has become an important research topic, and it has profound meaning to research the cache replacement algorithm in hybrid Cloud. There aren't enough considerations on some aspects, such as the selection of pending cache files, the prefetching of pending cache files among different clouds and the cost of recovery of files. Considering those shortages, this paper proposes an optimized LRU algorithm based on pre-selection and cache prefetching of files. This algorithm determines whether the file is to meet the pre-selection and cache prefetching conditions before adding a cache file, and it implements the LRU cache replacement algorithm which is based on priority. The algorithm divides the cache into multiple priority queues, and uses the LRU cache replacement algorithm to select the replacement file in each queue. Then select the files in each priority and put them together, select the file to perform replacement operation which has minimum probability of being accessed again. Compared with three typical cache replacement algorithm GD-Size, LRU, LFU, experimental results show that the cache replacement algorithm in this paper not only effectively save cost, but also greatly enhance the byte hit rate, delay savings rate and cache hit rate.","PeriodicalId":203925,"journal":{"name":"2016 17th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"The Optimization of LRU Algorithm Based on Pre-Selection and Cache Prefetching of Files in Hybrid Cloud\",\"authors\":\"Shumeng Du, Chunlin Li, XiJun Mao, Wei Yan\",\"doi\":\"10.1109/PDCAT.2016.039\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In recent years, the research on caching in cloud environment has become an important research topic, and it has profound meaning to research the cache replacement algorithm in hybrid Cloud. There aren't enough considerations on some aspects, such as the selection of pending cache files, the prefetching of pending cache files among different clouds and the cost of recovery of files. Considering those shortages, this paper proposes an optimized LRU algorithm based on pre-selection and cache prefetching of files. This algorithm determines whether the file is to meet the pre-selection and cache prefetching conditions before adding a cache file, and it implements the LRU cache replacement algorithm which is based on priority. The algorithm divides the cache into multiple priority queues, and uses the LRU cache replacement algorithm to select the replacement file in each queue. Then select the files in each priority and put them together, select the file to perform replacement operation which has minimum probability of being accessed again. 
Compared with three typical cache replacement algorithm GD-Size, LRU, LFU, experimental results show that the cache replacement algorithm in this paper not only effectively save cost, but also greatly enhance the byte hit rate, delay savings rate and cache hit rate.\",\"PeriodicalId\":203925,\"journal\":{\"name\":\"2016 17th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT)\",\"volume\":\"39 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 17th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/PDCAT.2016.039\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 17th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PDCAT.2016.039","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 7

Abstract

In recent years, caching in cloud environments has become an important research topic, and the study of cache replacement algorithms in hybrid clouds is of particular significance. Existing work gives insufficient consideration to several aspects, such as the selection of candidate cache files, the prefetching of candidate cache files across different clouds, and the cost of recovering files. To address these shortcomings, this paper proposes an optimized LRU algorithm based on file pre-selection and cache prefetching. Before a file is added to the cache, the algorithm checks whether it satisfies the pre-selection and cache-prefetching conditions, and it then applies a priority-based LRU replacement scheme: the cache is divided into multiple priority queues, LRU is used within each queue to nominate a replacement candidate, and among these candidates the file with the lowest probability of being accessed again is evicted. Experimental results comparing the proposed algorithm with three typical cache replacement algorithms (GD-Size, LRU, and LFU) show that it not only saves cost effectively, but also substantially improves the byte hit rate, the delay savings ratio, and the cache hit rate.
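The abstract describes the replacement policy only at a high level, so the following is a minimal Python sketch of that scheme. The class name PriorityLRUCache and the helpers passes_preselection and reuse_probability are illustrative placeholders: the paper does not specify its pre-selection condition or its re-access probability model, and the versions below are assumptions, not the authors' implementation.

```python
from collections import OrderedDict

class PriorityLRUCache:
    """Sketch of a priority-based LRU cache: one LRU queue per priority
    level; on eviction each queue nominates its least recently used entry,
    and the nominee with the lowest estimated re-access probability is
    removed."""

    def __init__(self, capacity, num_priorities=3):
        self.capacity = capacity              # total bytes the cache may hold
        self.used = 0                         # bytes currently cached
        # One OrderedDict per priority level; insertion order tracks recency.
        self.queues = [OrderedDict() for _ in range(num_priorities)]

    def passes_preselection(self, size, access_count):
        # Hypothetical pre-selection rule: the paper's actual condition is not
        # given in the abstract, so this placeholder only admits files that fit
        # in the cache and have been requested more than once.
        return size <= self.capacity and access_count > 1

    def reuse_probability(self, meta):
        # Hypothetical estimate of the chance a file is accessed again; the
        # paper's probability model is not stated, so a crude
        # frequency-per-byte proxy is used here.
        return meta["access_count"] / (1.0 + meta["size"])

    def get(self, file_id):
        for q in self.queues:
            if file_id in q:
                q.move_to_end(file_id)        # refresh recency within its queue
                q[file_id]["access_count"] += 1
                return q[file_id]
        return None                           # cache miss

    def put(self, file_id, size, priority, access_count=1):
        if not self.passes_preselection(size, access_count):
            return False                      # rejected before caching
        while self.used + size > self.capacity:
            self._evict()
        self.queues[priority][file_id] = {"size": size, "access_count": access_count}
        self.used += size
        return True

    def _evict(self):
        # Each non-empty queue nominates its least recently used entry ...
        candidates = []
        for pri, q in enumerate(self.queues):
            if q:
                file_id, meta = next(iter(q.items()))
                candidates.append((self.reuse_probability(meta), pri, file_id))
        # ... and the nominee least likely to be re-accessed is evicted.
        _, pri, victim = min(candidates)
        meta = self.queues[pri].pop(victim)
        self.used -= meta["size"]


# Illustrative use: a 100 MB cache with three priority levels (0 = lowest).
cache = PriorityLRUCache(capacity=100 * 1024 * 1024, num_priorities=3)
cache.put("report.csv", size=20 * 1024 * 1024, priority=1, access_count=3)
cache.get("report.csv")
```

Evicting only until the incoming file fits, and comparing one nominee per priority queue rather than scanning the whole cache, keeps each replacement decision proportional to the number of priority levels.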