Purpose: This paper explores the intersection of cloud computing, generative artificial intelligence (AI), and sustainability, focusing on how these technologies can drive the transition towards net-zero emissions. The aim is to assess how integrating cloud-based solutions and AI can enhance energy efficiency and support environmental goals. Methodology: The study employs a comprehensive review of recent literature, industry reports, and case studies from major cloud providers and technology companies. It analyzes the sustainability commitments of leading cloud providers, evaluates the role of generative AI in optimizing resource utilization, and examines the benefits of serverless automation in reducing carbon footprints. Findings: The research finds that cloud providers are making significant strides in sustainability through ambitious climate commitments and energy-efficient technologies. Generative AI is shown to improve decision-making and resource management, while serverless automation optimizes resource use and minimizes energy consumption. These advancements collectively contribute to achieving long-term sustainability goals and enhancing operational efficiency. Unique Contribution to Theory, Practice, and Policy: This paper provides a novel perspective on leveraging generative AI and serverless automation for sustainability. It offers actionable recommendations for organizations to integrate these technologies into their cloud strategies, emphasizing the importance of aligning with ESG criteria and optimizing data center operations. The findings support the development of policies that encourage technological innovation in the pursuit of environmental stewardship and carbon neutrality.
{"title":"Clouding the Future: Innovating Towards Net-Zero Emissions","authors":"Sridhar Mahadevan","doi":"10.47941/ijce.2127","DOIUrl":"https://doi.org/10.47941/ijce.2127","url":null,"abstract":"Purpose: This paper explores the intersection of cloud computing, generative artificial intelligence (AI), and sustainability, focusing on how these technologies can drive the transition towards net-zero emissions. The aim is to assess how integrating cloud-based solutions and AI can enhance energy efficiency and support environmental goals. \u0000Methodology: The study employs a comprehensive review of recent literature, industry reports, and case studies from major cloud providers and technology companies. It analyzes the sustainability commitments of leading cloud providers, evaluates the role of generative AI in optimizing resource utilization, and examines the benefits of serverless automation in reducing carbon footprints. \u0000Findings: The research finds that cloud providers are making significant strides in sustainability through ambitious climate commitments and energy-efficient technologies. Generative AI is shown to improve decision-making and resource management, while serverless automation optimizes resource use and minimizes energy consumption. These advancements collectively contribute to achieving long-term sustainability goals and enhancing operational efficiency. \u0000Unique Contribution to Theory, Practice, and Policy: This paper provides a novel perspective on leveraging generative AI and serverless automation for sustainability. It offers actionable recommendations for organizations to integrate these technologies into their cloud strategies, emphasizing the importance of aligning with ESG criteria and optimizing data center operations. 
The findings support the development of policies that encourage technological innovation in the pursuit of environmental stewardship and carbon neutrality.","PeriodicalId":198033,"journal":{"name":"International Journal of Computing and Engineering","volume":"87 12","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141797829","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Purpose: User authentication in distributed systems presents unique challenges due to the decentralized nature of these environments and the potential for high-volume login attempts. This paper proposes an efficient method for UserID existence checking during the login process using Bloom filters, a space-efficient probabilistic data structure. Our approach aims to reduce authentication latency and minimize network traffic while maintaining a high level of security. Methodology: We present a novel system architecture that incorporates Bloom filters at strategic points within the distributed system to perform rapid preliminary checks on UserID existence. This method allows for quick rejection of non-existent UserIDs without querying the main user database, significantly reducing the load on central authentication servers. The paper details the implementation of Bloom filters optimized for UserID storage and lookup, including considerations for filter size, hash function selection, and false positive rate management. We also describe the integration of this method into a typical authentication workflow, highlighting the points at which Bloom filter checks are performed and how they interact with existing security measures. Findings: To evaluate the effectiveness of our approach, we conducted extensive experiments simulating various scales of distributed systems and login attempt patterns. Our results demonstrate that the Bloom filter-based UserID existence checking method reduces authentication latency by an average of 37% compared to traditional database lookup methods. Additionally, we observed a 42% decrease in network traffic related to authentication processes, indicating improved scalability for large-scale distributed systems. The paper also discusses the trade-offs inherent in using probabilistic data structures for security-critical operations, addressing potential vulnerabilities and proposing mitigation strategies. 
We conclude by outlining future research directions, including adaptive Bloom filter sizing and the potential application of this method to other aspects of distributed system security. Unique Contribution to Theory, Policy and Practice: This research contributes to the field of distributed systems security by providing a practical, efficient, and scalable solution for UserID existence checking, potentially improving the performance and user experience of large-scale authentication systems.
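The abstract mentions the key Bloom filter design levers (filter size, hash function selection, false positive rate) without giving the implementation. A minimal sketch of the idea, using the standard sizing formulas and the Kirsch–Mitzenmacher double-hashing trick, might look like the following; the class and function names are illustrative, not the paper's.

```python
import hashlib
import math

class BloomFilter:
    """Space-efficient probabilistic set for fast membership pre-checks.

    Standard sizing formulas for n items at target false-positive rate p:
      m = -n * ln(p) / (ln 2)^2   bits in the filter
      k = (m / n) * ln 2          optimal number of hash functions
    """

    def __init__(self, expected_items: int, fp_rate: float = 0.01):
        self.m = math.ceil(-expected_items * math.log(fp_rate) / (math.log(2) ** 2))
        self.k = max(1, round((self.m / expected_items) * math.log(2)))
        self.bits = bytearray((self.m + 7) // 8)

    def _positions(self, item: str):
        # Derive k bit indices from one SHA-256 digest (double hashing):
        # index_i = (h1 + i * h2) mod m, which approximates k independent hashes.
        digest = hashlib.sha256(item.encode()).digest()
        h1 = int.from_bytes(digest[:8], "big")
        h2 = int.from_bytes(digest[8:16], "big") | 1
        for i in range(self.k):
            yield (h1 + i * h2) % self.m

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, item: str) -> bool:
        # False positives possible; false negatives are not.
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

def precheck_login(bf: BloomFilter, user_id: str) -> str:
    """Login-time pre-check: reject unknown UserIDs before touching the database."""
    if not bf.might_contain(user_id):
        return "reject"           # definitely not registered; skip the DB query
    return "query-database"       # possibly registered; fall through to full auth
```

Because a Bloom filter never yields false negatives, a registered UserID always passes the pre-check; the tunable cost is the small fraction of non-existent UserIDs that slip through to the database, which is exactly the false-positive-rate trade-off the paper discusses.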
{"title":"Fast and Efficient UserID Lookup in Distributed Authentication: A Probabilistic Approach Using Bloom Filters","authors":"Purshotam S Yadav","doi":"10.47941/ijce.2124","DOIUrl":"https://doi.org/10.47941/ijce.2124","url":null,"abstract":"Purpose: User authentication in distributed systems presents unique challenges due to the decentralized nature of these environments and the potential for high-volume login attempts. This paper proposes an efficient method for UserID existence checking during the login process using Bloom filters, a space-efficient probabilistic data structure. Our approach aims to reduce authentication latency and minimize network traffic while maintaining a high level of security. \u0000Methodology: We present a novel system architecture that incorporates Bloom filters at strategic points within the distributed system to perform rapid preliminary checks on UserID existence. This method allows for quick rejection of non-existent UserIDs without querying the main user database, significantly reducing the load on central authentication servers. The paper details the implementation of Bloom filters optimized for UserID storage and lookup, including considerations for filter size, hash function selection, and false positive rate management. We also describe the integration of this method into a typical authentication workflow, highlighting the points at which Bloom filter checks are performed and how they interact with existing security measures. \u0000Findings: To evaluate the effectiveness of our approach, we conducted extensive experiments simulating various scales of distributed systems and login attempt patterns. Our results demonstrate that the Bloom filter-based UserID existence checking method reduces authentication latency by an average of 37% compared to traditional database lookup methods. 
Additionally, we observed a 42% decrease in network traffic related to authentication processes, indicating improved scalability for large-scale distributed systems. The paper also discusses the trade-offs inherent in using probabilistic data structures for security-critical operations, addressing potential vulnerabilities and proposing mitigation strategies. We conclude by outlining future research directions, including adaptive Bloom filter sizing and the potential application of this method to other aspects of distributed system security. \u0000Unique Contribution to Theory, Policy and Practice: This research contributes to the field of distributed systems security by providing a practical, efficient, and scalable solution for UserID existence checking, potentially improving the performance and user experience of large-scale authentication systems.","PeriodicalId":198033,"journal":{"name":"International Journal of Computing and Engineering","volume":"21 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141807442","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In the era of digital transformation and increasing online interactions, customer support is a critical aspect of business success. This paper investigates the development of adaptive customer support chatbots that use real-time sentiment analysis to generate contextually appropriate responses. By leveraging advanced sentiment detection techniques, the system aims to enhance user interaction, satisfaction, and overall customer service experience. This innovation is particularly relevant in today's fast-paced, digitally connected world where personalized and empathetic customer service can significantly impact brand loyalty and customer retention. The proposed approach addresses the growing demand for more intelligent and emotionally aware chatbots, aligning with current trends in artificial intelligence and consumer expectations.
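The abstract does not specify the sentiment-detection technique. As a rough illustration of the adaptive loop it describes (score the incoming message, then shape the reply to the detected mood), here is a deliberately simple lexicon-based sketch; a production system would substitute a trained sentiment model, and all names and thresholds here are assumptions.

```python
# Toy sentiment lexicons; a real chatbot would use a trained classifier.
NEGATIVE = {"angry", "broken", "terrible", "refund", "worst", "frustrated"}
POSITIVE = {"great", "thanks", "love", "perfect", "happy"}

def sentiment(text: str) -> float:
    """Crude polarity score in [-1, 1]: lexicon hits normalized by length."""
    words = text.lower().split()
    if not words:
        return 0.0
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return score / len(words)

def adaptive_reply(text: str) -> str:
    """Pick a response register based on real-time sentiment of the message."""
    s = sentiment(text)
    if s < -0.1:   # frustrated user: empathize and escalate
        return "I'm sorry for the trouble. Let me connect you with a specialist."
    if s > 0.1:    # satisfied user: keep the tone light
        return "Glad to hear it! Anything else I can help with?"
    return "Thanks for the details. Could you tell me a bit more?"
```

The design point is the separation of concerns: the scorer can be swapped for any model emitting a polarity value, while the response policy stays a small, auditable mapping from sentiment bands to conversational registers.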
{"title":"Adaptive Chatbots: Real-Time Sentiment Analysis for Customer Support","authors":"Rekha Sivakolundhu, Deepak Nanuru Yagamurthy","doi":"10.47941/ijce.2123","DOIUrl":"https://doi.org/10.47941/ijce.2123","url":null,"abstract":"In the era of digital transformation and increasing online interactions, customer support is a critical aspect of business success. This paper investigates the development of adaptive customer support chatbots that use real-time sentiment analysis to generate contextually appropriate responses. By leveraging advanced sentiment detection techniques, the system aims to enhance user interaction, satisfaction, and overall customer service experience. This innovation is particularly relevant in today's fast-paced, digitally connected world where personalized and empathetic customer service can significantly impact brand loyalty and customer retention. The proposed approach addresses the growing demand for more intelligent and emotionally aware chatbots, aligning with current trends in artificial intelligence and consumer expectations.","PeriodicalId":198033,"journal":{"name":"International Journal of Computing and Engineering","volume":"67 26","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141806779","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Purpose: This whitepaper is a comprehensive guide to global AI regulations, focusing on the EU AI Act as a pivotal framework. It analyses the Act's impact on industries, outlines key guidelines, and compares global AI initiatives. Methodology: The whitepaper employs a detailed analysis of the EU AI Act, examining its stringent rules for AI ethics and safety, together with a comparative study of global AI initiatives to understand the broader regulatory landscape. Findings: The whitepaper finds that the EU AI Act is inspiring global standards for AI ethics and safety. It highlights businesses' compliance challenges and the need for responsible innovation to build consumer trust. Unique contribution to theory, practice, and policy (recommendations): Businesses should stay updated on evolving AI laws to ensure ethical AI use. The whitepaper offers insights into the regulatory landscape and practical compliance advice, suggesting that aligning global AI regulations with the strict standards of the EU AI Act could be beneficial.
{"title":"Comprehensive Guide to AI Regulations: Analyzing the EU AI Act and Global Initiatives","authors":"Puneet Matai","doi":"10.47941/ijce.2110","DOIUrl":"https://doi.org/10.47941/ijce.2110","url":null,"abstract":"Purpose: The whitepaper is a comprehensive guide on global AI regulations, focusing on the EU AI Act as a pivotal framework. It analyses its impact on industries, outlines key guidelines, and compares global AI initiatives. \u0000Methodology: The whitepaper employs a detailed analysis of the EU AI Act, examining its stringent rules for AI ethics and safety. It also involves a comparative study of global AI initiatives to understand the broader landscape of AI regulations. \u0000Findings: The whitepaper finds that the EU AI Act inspires global standards for AI ethics and safety. It highlights businesses' compliance challenges and the need for responsible innovation to build consumer trust. \u0000Unique contribution to theory, practice, and policy (recommendations): Businesses should stay updated on evolving AI laws to ensure ethical AI use. The whitepaper offers insights into the regulatory landscape and practical compliance advice, suggesting that aligning global AI regulations with the strict standards of the EU AI Act could be beneficial","PeriodicalId":198033,"journal":{"name":"International Journal of Computing and Engineering","volume":"212 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141828274","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Purpose: The general objective of this study was to explore cybersecurity frameworks for cloud computing environments. Methodology: The study adopted a desktop research methodology. Desk research refers to secondary data, or data that can be collected without fieldwork. Because it draws on existing resources, it is often considered a low-cost technique compared with field research, the main costs being researchers' time, telephone charges, and directories. The study therefore relied on already published studies, reports, and statistics; this secondary data was easily accessed through online journals and libraries. Findings: The findings reveal a contextual and methodological gap relating to cybersecurity frameworks for cloud computing environments. The study emphasized the necessity of robust, comprehensive security measures to address the unique challenges of cloud infrastructures. It highlighted the importance of advanced security measures such as encryption, multi-factor authentication, and continuous monitoring to mitigate risks. The research underscored the need for holistic and adaptable frameworks that integrate technological solutions and human factors, while also stressing regulatory compliance. The findings have significant implications for cloud service providers, businesses, regulatory bodies, and cybersecurity professionals, suggesting a focus on emerging technologies such as AI and blockchain for future research. Unique Contribution to Theory, Practice and Policy: The Diffusion of Innovations Theory, the Technology Acceptance Model (TAM), and Socio-Technical Systems Theory may be used to anchor future studies on cybersecurity frameworks for cloud computing environments. The study made significant theoretical, practical, and policy recommendations. It emphasized the need for an integrated theoretical approach, the adoption of multi-layered security practices, and regular security assessments.
The study also advocated for standardized and specific regulatory frameworks tailored to cloud environments and international cooperation for consistent global cybersecurity policies. These recommendations aimed to enhance the understanding, implementation, and governance of cloud security, ultimately contributing to a more resilient and secure cloud computing ecosystem.
{"title":"Cybersecurity Frameworks for Cloud Computing Environments","authors":"Elizabeth Shelly","doi":"10.47941/ijce.2058","DOIUrl":"https://doi.org/10.47941/ijce.2058","url":null,"abstract":"Purpose: The general objective of this study was to explore cybersecurity frameworks for cloud computing environments. \u0000Methodology: The study adopted a desktop research methodology. Desk research refers to secondary data or that which can be collected without fieldwork. Desk research is basically involved in collecting data from existing resources hence it is often considered a low cost technique as compared to field research, as the main cost is involved in executive’s time, telephone charges and directories. Thus, the study relied on already published studies, reports and statistics. This secondary data was easily accessed through the online journals and library. \u0000Findings: The findings reveal that there exists a contextual and methodological gap relating to explore cybersecurity frameworks for cloud computing environments. The study emphasized the necessity of robust, comprehensive security measures to address the unique challenges of cloud infrastructures. It highlighted the importance of advanced security measures like encryption, multi-factor authentication, and continuous monitoring to mitigate risks. The research underscored the need for holistic and adaptable frameworks that integrate technological solutions and human factors, while also stressing regulatory compliance. The findings had significant implications for cloud service providers, businesses, regulatory bodies, and cybersecurity professionals, suggesting a focus on new technologies like AI and blockchain for future research. \u0000Unique Contribution to Theory, Practice and Policy: The Diffusion of Innovations Theory, Technology Acceptance Model (ATM) and Socio-Technical Systems Theory may be used to anchor future studies on cybersecurity frameworks for cloud computing environments. 
The study made significant theoretical, practical, and policy recommendations. It emphasized the need for an integrated theoretical approach, the adoption of multi-layered security practices, and regular security assessments. The study also advocated for standardized and specific regulatory frameworks tailored to cloud environments and international cooperation for consistent global cybersecurity policies. These recommendations aimed to enhance the understanding, implementation, and governance of cloud security, ultimately contributing to a more resilient and secure cloud computing ecosystem.","PeriodicalId":198033,"journal":{"name":"International Journal of Computing and Engineering","volume":"50 12","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141654660","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Purpose: The general objective of this study was to examine Software-Defined Networking (SDN) for efficient network management. Methodology: The study adopted a desktop research methodology. Desk research refers to secondary data, or data that can be collected without fieldwork. Because it draws on existing resources, it is often considered a low-cost technique compared with field research, the main costs being researchers' time, telephone charges, and directories. The study therefore relied on already published studies, reports, and statistics; this secondary data was easily accessed through online journals and libraries. Findings: The findings reveal a contextual and methodological gap relating to Software-Defined Networking (SDN) for efficient network management. A preliminary empirical review revealed that SDN offered significant advantages in enhancing network agility, scalability, and operational efficiency. By centralizing network management functions and abstracting network control, SDN enabled dynamic resource allocation and optimized traffic flows. However, challenges such as security vulnerabilities, interoperability issues, and the need for specialized skills were identified. Successful SDN implementation required careful planning, rigorous testing, and strategic integration with existing IT infrastructures. Future research recommendations included further exploration of SDN technologies, evaluation of their impact on network performance and security, and the development of best practices for deployment and management to maximize benefits. Unique Contribution to Theory, Practice and Policy: The Diffusion of Innovations Theory, the Technology Acceptance Model (TAM), and Resource-Based View (RBV) Theory may be used to anchor future studies on Software-Defined Networking (SDN).
The recommendations drawn from the study on Software-Defined Networking (SDN) for Efficient Network Management focused on enhancing theoretical frameworks, improving practical implementations, informing policy development, promoting industry collaboration, addressing security concerns, and facilitating stakeholder engagement. These initiatives aimed to strengthen SDN adoption and implementation by refining theoretical models, advocating for supportive policy environments, fostering industry partnerships, addressing security challenges, and engaging stakeholders throughout the deployment process. By integrating these strategies, the study sought to optimize network management efficiency and promote sustainable technological advancements in SDN.
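The core SDN idea the study describes, a logically centralized control plane that holds the global topology and pushes forwarding rules down to switches, can be illustrated with a small toy model. Everything below (class names, the BFS path computation, the flow-table shape) is an illustrative sketch, not any specific controller's API.

```python
from collections import deque

class Switch:
    """Data plane: forwards packets by consulting a flow table (dst -> next hop)."""
    def __init__(self, name: str):
        self.name = name
        self.flow_table: dict[str, str] = {}

class Controller:
    """Control plane: keeps a global topology view and installs rules on switches."""
    def __init__(self):
        self.switches: dict[str, Switch] = {}
        self.links: dict[str, set[str]] = {}

    def add_switch(self, name: str) -> None:
        self.switches[name] = Switch(name)
        self.links.setdefault(name, set())

    def add_link(self, a: str, b: str) -> None:
        self.links[a].add(b)
        self.links[b].add(a)

    def install_path(self, src: str, dst: str):
        """BFS shortest path over the global view, then push per-hop flow rules."""
        prev, seen, queue = {}, {src}, deque([src])
        while queue:
            node = queue.popleft()
            if node == dst:
                break
            for nbr in self.links[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    prev[nbr] = node
                    queue.append(nbr)
        if dst not in prev and src != dst:
            return None                      # destination unreachable
        path, node = [dst], dst
        while node != src:
            node = prev[node]
            path.append(node)
        path.reverse()
        for here, nxt in zip(path, path[1:]):
            self.switches[here].flow_table[dst] = nxt
        return path
```

The sketch makes the study's central claim concrete: because the controller alone sees the whole topology, rerouting traffic is a single recomputation plus rule pushes, rather than per-device reconfiguration across the network.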
{"title":"Software-Defined Networking (SDN) for Efficient Network Management","authors":"Binti Shalom","doi":"10.47941/ijce.2056","DOIUrl":"https://doi.org/10.47941/ijce.2056","url":null,"abstract":"Purpose: The general objective of this study was to examine Software-Defined Networking (SDN) for efficient network management. \u0000Methodology: The study adopted a desktop research methodology. Desk research refers to secondary data or that which can be collected without fieldwork. Desk research is basically involved in collecting data from existing resources hence it is often considered a low cost technique as compared to field research, as the main cost is involved in executive’s time, telephone charges and directories. Thus, the study relied on already published studies, reports and statistics. This secondary data was easily accessed through the online journals and library. \u0000Findings: The findings reveal that there exists a contextual and methodological gap relating to Software-Defined Networking (SDN) for efficient network management. Preliminary empirical review revealed that SDN offered significant advantages in enhancing network agility, scalability, and operational efficiency. By centralizing network management functions and abstracting network control, SDN enabled dynamic resource allocation and optimized traffic flows. However, challenges such as security vulnerabilities, interoperability issues, and the need for specialized skills were identified. Successful SDN implementation required careful planning, rigorous testing, and strategic integration with existing IT infrastructures. Future research recommendations included further exploration of SDN technologies, evaluation of their impact on network performance and security, and the development of best practices for deployment and management to maximize benefits. 
\u0000Unique Contribution to Theory, Practice and Policy: The Diffusion of Innovations Theory, Technology Acceptance Model (TAM) and Resource Based View (RBV) Theory may be used to anchor future studies on Software-Defined Networking (SDN). The recommendations drawn from the study on Software-Defined Networking (SDN) for Efficient Network Management focused on enhancing theoretical frameworks, improving practical implementations, informing policy development, promoting industry collaboration, addressing security concerns, and facilitating stakeholder engagement. These initiatives aimed to strengthen SDN adoption and implementation by refining theoretical models, advocating for supportive policy environments, fostering industry partnerships, addressing security challenges, and engaging stakeholders throughout the deployment process. By integrating these strategies, the study sought to optimize network management efficiency and promote sustainable technological advancements in SDN.","PeriodicalId":198033,"journal":{"name":"International Journal of Computing and Engineering","volume":"86 12","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141652921","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Purpose: This study sought to explore big data analytics for smart cities. Methodology: The study adopted a desktop research methodology. Desk research refers to secondary data, or data that can be collected without fieldwork. Because it draws on existing resources, it is often considered a low-cost technique compared with field research, the main costs being researchers' time, telephone charges, and directories. The study therefore relied on already published studies, reports, and statistics; this secondary data was easily accessed through online journals and libraries. Findings: The findings reveal a contextual and methodological gap relating to big data analytics for smart cities. The integration of big data analytics into smart city operations significantly improved urban management efficiency, sustainability, and residents' quality of life. By leveraging advanced analytics, cities optimized traffic flow, reduced energy consumption, enhanced public safety, improved healthcare delivery, and monitored environmental conditions in real time. These advancements led to smoother services, economic sustainability, better public safety, effective disaster management, and proactive environmental and health interventions, making cities more responsive, resilient, and sustainable. Unique Contribution to Theory, Practice and Policy: The Diffusion of Innovations Theory, Socio-Technical Systems Theory, and Actor-Network Theory may be used to anchor future studies on big data analytics for smart cities. The study made significant contributions to theory, practice, and policy by extending the Diffusion of Innovations and Socio-Technical Systems theories with empirical evidence, recommending robust data governance frameworks and skilled analytics units for practical implementation, and advocating for comprehensive policies to ensure data privacy and security.
It highlighted the importance of stakeholder collaboration, investment in technological infrastructure, and future research on long-term impacts, ethical considerations, and emerging technologies to enhance the efficiency and sustainability of smart cities.

"Big Data Analytics for Smart Cities" by Cassie Davies. International Journal of Computing and Engineering. DOI: https://doi.org/10.47941/ijce.2057. Published 2024-07-12.
Purpose: This study sought to explore big data analytics for smart cities.
Methodology: The study adopted a desktop research methodology. Desk research draws on secondary data, that is, data that can be collected without fieldwork. Because it gathers data from existing resources, it is generally a low-cost technique compared to field research, with the main costs being researchers' time, telephone charges, and directories. The study therefore relied on already published studies, reports, and statistics, accessed through online journals and libraries.
Findings: The findings reveal a contextual and methodological gap relating to big data analytics for smart cities. The integration of big data analytics into smart city operations significantly improved urban management efficiency, sustainability, and residents' quality of life. By leveraging advanced analytics, cities optimized traffic flow, reduced energy consumption, enhanced public safety, improved healthcare delivery, and monitored environmental conditions in real time. These advancements led to smoother services, economic sustainability, better public safety, effective disaster management, and proactive environmental and health interventions, making cities more responsive, resilient, and sustainable.
Unique Contribution to Theory, Practice and Policy: The Diffusion of Innovations Theory, Socio-Technical Systems Theory, and Actor-Network Theory may be used to anchor future studies on big data analytics for smart cities. The study made significant contributions to theory, practice, and policy by extending the Diffusion of Innovations and Socio-Technical Systems theories with empirical evidence, recommending robust data governance frameworks and skilled analytics units for practical implementation, and advocating for comprehensive policies to ensure data privacy and security. It highlighted the importance of stakeholder collaboration, investment in technological infrastructure, and future research on long-term impacts, ethical considerations, and emerging technologies to enhance the efficiency and sustainability of smart cities.
"Wearable Technology for Health Monitoring and Diagnostics" by Sara Boyd. International Journal of Computing and Engineering. DOI: https://doi.org/10.47941/ijce.2041. Published 2024-07-10.
Purpose: The general objective of this study was to investigate wearable technology for health monitoring and diagnostics.
Methodology: The study adopted a desktop research methodology. Desk research draws on secondary data, that is, data that can be collected without fieldwork. Because it gathers data from existing resources, it is generally a low-cost technique compared to field research, with the main costs being researchers' time, telephone charges, and directories. The study therefore relied on already published studies, reports, and statistics, accessed through online journals and libraries.
Findings: The findings reveal a contextual and methodological gap relating to wearable technology for health monitoring and diagnostics. The advancement of wearable technology in health monitoring and diagnostics transformed the healthcare landscape by enabling continuous, real-time data collection and analysis, empowering individuals to manage their health proactively. Despite its benefits, challenges such as data privacy, device accuracy, and user adherence needed addressing. Ensuring robust data protection, validating device accuracy in diverse environments, and understanding barriers to sustained use were essential. Addressing the digital divide was also vital for equitable access. Overall, wearable technology held significant promise for preventive care and early diagnosis, but it required ongoing research and collaboration to maximize its impact.
Unique Contribution to Theory, Practice and Policy: The Technology Acceptance Model (TAM), Health Belief Model (HBM), and Unified Theory of Acceptance and Use of Technology (UTAUT) may be used to anchor future studies on wearable technology for health monitoring and diagnostics. The study recommended integrating technology acceptance and health behavior theories to provide a comprehensive understanding of adoption factors, emphasizing user-centered design for enhanced engagement, and advocating for stringent data privacy and security standards. It highlighted the importance of integrating wearable technology into healthcare systems for better clinical decisions, promoting equitable access, and using wearables in public health initiatives. The study also called for collaboration between technology developers, healthcare providers, and policymakers to address challenges and maximize the benefits of wearable health technology.
"The Impact of Edge Computing on Real-Time Data Processing" by Brian Kelly. International Journal of Computing and Engineering. DOI: https://doi.org/10.47941/ijce.2042. Published 2024-07-10.
Purpose: The study sought to explore the impact of edge computing on real-time data processing.
Methodology: The study adopted a desktop research methodology. Desk research draws on secondary data, that is, data that can be collected without fieldwork. Because it gathers data from existing resources, it is generally a low-cost technique compared to field research, with the main costs being researchers' time, telephone charges, and directories. The study therefore relied on already published studies, reports, and statistics, accessed through online journals and libraries.
Findings: The findings reveal a contextual and methodological gap relating to the impact of edge computing on real-time data processing. A preliminary empirical review revealed that edge computing significantly reduced latency and enhanced efficiency in real-time data processing across various industries by bringing computational resources closer to data sources. It highlighted the technology's ability to handle large volumes of IoT-generated data, improve security by localizing data processing, and drive innovation and economic growth through new applications and services. Edge computing's decentralized approach proved essential for reliable and robust data handling, particularly in critical sectors like healthcare and finance, ultimately solidifying its importance in the digital transformation landscape.
Unique Contribution to Theory, Practice and Policy: The Diffusion of Innovations Theory, Resource-Based View (RBV), and Sociotechnical Systems Theory may be used to anchor future studies on edge computing and real-time data processing. The study recommended expanding theoretical frameworks to include the unique aspects of edge computing, investing in robust edge infrastructure, and developing standardized protocols and best practices. It emphasized the need for government incentives and supportive regulatory frameworks to promote adoption, and it suggested that academic institutions incorporate edge computing into curricula. Additionally, the study called for ongoing research to address emerging challenges and opportunities, ensuring continuous advancement and effective implementation of edge computing technologies.
"Ethical Considerations in the Collection and Handling of Financial Data in ETC" by Pankaj Lembhe. International Journal of Computing and Engineering. DOI: https://doi.org/10.47941/ijce.1823. Published 2024-04-24.
Purpose: This research note provides practical advice on accessing financial reports on the internet and collecting the associated data for subsequent analysis.
Methodology: Potential methodological problems, such as biases or conflicts of interest, should be acknowledged, and strategies for addressing them should be outlined.
Findings: Additionally, the confidentiality and privacy of the individuals or companies whose financial data is being collected must be protected.
Unique Contribution to Theory, Practice, and Policy: The guidance on accessing and collecting financial data online contributes both to the theoretical framework of financial analysis and to practical applications for researchers and practitioners, ensuring that ethical standards are maintained.