Shakthi Weerasinghe, A. Zaslavsky, S. Loke, A. Abken, A. Hassani, A. Medvedev
{"title":"Adaptive Context Caching for Efficient Distributed Context Management Systems","authors":"Shakthi Weerasinghe, A. Zaslavsky, S. Loke, A. Abken, A. Hassani, A. Medvedev","doi":"10.1145/3555776.3577602","DOIUrl":null,"url":null,"abstract":"We contend that performance metrics-driven adaptive context caching has a profound impact on performance efficiency in distributed context management systems (CMS). This paper proposes an adaptive context caching approach based on (i) a model of economics-inspired expected returns of caching particular items, and (ii) learning from historical context caching performance, i.e., our approach adaptively (with respect to statistics on historical performance) caches \"context\" with the objective of minimizing the cost incurred by a CMS in responding to context queries. Our novel algorithm enables context queries and sub-queries to reuse and repurpose cached context in an efficient manner, different from traditional data caching. The paper also proposes heuristics and adaptive policies such as eviction and context cache memory scaling. The method is evaluated using a synthetically generated load of sub-queries inspired by a real-world scenario. We further investigate optimal adaptive caching configurations under different settings. This paper presents and discusses our findings that the proposed statistical selective caching method reaches short-term cost optimality fast under massively volatile queries. The proposed method outperforms related algorithms by up to 47.9% in cost efficiency.","PeriodicalId":42971,"journal":{"name":"Applied Computing Review","volume":null,"pages":null},"PeriodicalIF":0.4000,"publicationDate":"2023-03-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Computing Review","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3555776.3577602","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 5
Abstract
We contend that performance metrics-driven adaptive context caching has a profound impact on performance efficiency in distributed context management systems (CMS). This paper proposes an adaptive context caching approach based on (i) an economics-inspired model of the expected return of caching particular items, and (ii) learning from historical context caching performance; that is, our approach adaptively (with respect to statistics on historical performance) caches "context" with the objective of minimizing the cost incurred by a CMS in responding to context queries. Our novel algorithm enables context queries and sub-queries to reuse and repurpose cached context efficiently, unlike traditional data caching. The paper also proposes heuristics and adaptive policies such as eviction and context cache memory scaling. The method is evaluated using a synthetically generated load of sub-queries inspired by a real-world scenario. We further investigate optimal adaptive caching configurations under different settings. This paper presents and discusses our findings that the proposed statistical selective caching method quickly reaches short-term cost optimality under massively volatile query loads. The proposed method outperforms related algorithms by up to 47.9% in cost efficiency.
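To make the selective caching decision concrete, the sketch below illustrates one way an economics-inspired expected-return criterion could drive the choice of which context items to cache. All names, parameters, and the cost model here are illustrative assumptions for exposition; they are not the paper's actual formulation or implementation.

```python
# Minimal sketch of expected-return-driven selective context caching.
# The statistics, cost model, and thresholds are hypothetical.

from dataclasses import dataclass


@dataclass
class ContextItemStats:
    """Rolling statistics gathered for a candidate context item (assumed)."""
    hit_rate: float           # estimated probability a cached copy is reused per query
    retrieval_cost: float     # cost of fetching/deriving the item on a cache miss
    caching_cost: float       # cost of storing and refreshing the item while cached
    expected_lifetime: float  # seconds the item is expected to stay valid in the cache


def expected_return(stats: ContextItemStats, query_rate: float) -> float:
    """Estimated net saving from caching the item over its expected lifetime.

    Savings accrue each time a query is served from the cache instead of
    re-retrieving the context; costs cover storage and refreshing.
    """
    expected_hits = query_rate * stats.expected_lifetime * stats.hit_rate
    return expected_hits * stats.retrieval_cost - stats.caching_cost


def should_cache(stats: ContextItemStats, query_rate: float,
                 threshold: float = 0.0) -> bool:
    """Cache only items whose expected return exceeds the threshold."""
    return expected_return(stats, query_rate) > threshold


if __name__ == "__main__":
    # Example: a frequently reused item is cached; a rarely reused one is not.
    hot = ContextItemStats(hit_rate=0.8, retrieval_cost=5.0,
                           caching_cost=2.0, expected_lifetime=30.0)
    cold = ContextItemStats(hit_rate=0.01, retrieval_cost=5.0,
                            caching_cost=2.0, expected_lifetime=30.0)
    print(should_cache(hot, query_rate=1.0))   # True
    print(should_cache(cold, query_rate=1.0))  # False
```

In such a scheme, the statistics would be updated from observed caching performance, so the caching decisions adapt as query volatility changes, which is the general idea the abstract attributes to the proposed approach.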