Cold Start in Serverless Computing: Current Trends and Mitigation Strategies
Parichehr Vahidinia, Bahareh J. Farahani, F. S. Aliee
2020 International Conference on Omni-layer Intelligent Systems (COINS), August 2020
DOI: 10.1109/COINS49042.2020.9191377
Citations: 32
Abstract
Serverless Computing is the latest cloud computing model, and it simplifies application development. By adopting this paradigm, developers no longer need to manage servers. In this computational model, the executables are independent functions that are individually deployed on a serverless platform offering instant per-request elasticity. Such elasticity, however, typically comes at the cost of the “cold start” problem: a delay incurred while provisioning a runtime container to execute a function. Shortly after Amazon introduced this computing model with the AWS Lambda platform in 2014, several open-source and commercial platforms also began embracing and offering this technology, each with its own approach to dealing with cold starts. Evaluating each platform's performance under load, as well as the factors influencing the cold start problem, has received much attention over the past few years. This paper provides a comprehensive overview of recent advances and state-of-the-art work in mitigating the cold start delay. Moreover, several sets of experiments were performed on AWS Lambda, as the baseline platform, to study its behavior with respect to the cold start delay.
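To make the cold start phenomenon concrete, the sketch below shows one common way to observe it on AWS Lambda: module-level code runs only when a new container is provisioned, so a module-level flag distinguishes cold from warm invocations. This is an illustrative Python handler under that assumption, not the experimental setup used in the paper; the handler name and returned fields are hypothetical.

```python
# Illustrative AWS Lambda handler (Python).
# Module-level statements execute once per container, i.e. only on a cold start,
# so a global flag lets each invocation report whether it was served cold or warm.
import time

# Runs only when a fresh container is provisioned (cold start).
_CONTAINER_START = time.time()
_IS_COLD = True


def handler(event, context):
    global _IS_COLD
    cold = _IS_COLD
    _IS_COLD = False  # later invocations reuse this container (warm starts)
    return {
        "cold_start": cold,
        "container_age_seconds": round(time.time() - _CONTAINER_START, 3),
    }
```

Invoking such a function repeatedly, with varying idle intervals between calls, is one simple way to observe when the platform recycles containers and how much extra latency a cold invocation adds.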