
Latest publications in Future Generation Computer Systems-The International Journal of Escience

ROPGMN: Effective ROP and variants discovery using dynamic feature and graph matching network
IF 6.2 | CAS Tier 2 (Computer Science) | Q1 COMPUTER SCIENCE, THEORY & METHODS | Pub Date: 2024-10-22 | DOI: 10.1016/j.future.2024.107567
Weina Niu, Kexuan Zhang, Ran Yan, Jie Li, Yan Zhang, Xiaosong Zhang
Return Oriented Programming (ROP) is one of the most challenging threats to operating systems. Traditional detection and defense techniques for ROP, such as stack protection, address randomization, compiler optimization, control flow integrity, and basic block thresholds, have limitations in accuracy or efficiency. At the same time, they cannot effectively detect ROP variant attacks such as COP, COOP, and JOP. In this paper, we propose a novel approach for detecting ROP and its variants that first filters normal execution flows according to four provided strategies and then adopts a Graph Matching Network (GMN) to determine whether a ROP or variant attack is present. Moreover, we developed a prototype named ROPGMN that uses shared memory to solve cross-language and cross-process problems. Using real-world vulnerable programs and constructed programs with dangerous function calls, we conduct extensive experiments with six ROP detectors to evaluate ROPGMN. The experimental results demonstrate the effectiveness of ROPGMN in discovering ROP and its variant attacks with low performance overhead.
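As a rough illustration of the detection idea (not the ROPGMN model itself), the Python sketch below represents a filtered execution flow as a set of control-transfer edges and scores it against known-benign reference graphs; a simple Jaccard edge similarity stands in for the learned Graph Matching Network, and every function name, address, and threshold is hypothetical.

# Toy graph-similarity screening for ROP-like traces (not the ROPGMN model).
# A Jaccard similarity over edge sets stands in for the learned GMN score.

def build_transition_graph(trace):
    """Turn an execution trace (list of control-transfer targets) into an edge set."""
    return {(trace[i], trace[i + 1]) for i in range(len(trace) - 1)}

def graph_similarity(edges_a, edges_b):
    """Jaccard similarity between two edge sets (stand-in for a GMN embedding distance)."""
    if not edges_a and not edges_b:
        return 1.0
    return len(edges_a & edges_b) / len(edges_a | edges_b)

def looks_like_rop(trace, benign_references, threshold=0.3):
    """Flag a trace whose transition graph matches no benign reference closely enough."""
    g = build_transition_graph(trace)
    best = max((graph_similarity(g, ref) for ref in benign_references), default=0.0)
    return best < threshold

# Hypothetical usage: benign traces recorded offline, one suspicious trace at runtime.
benign = [build_transition_graph([0x400100, 0x400180, 0x400200, 0x400280])]
suspicious = [0x400100, 0x401fb3, 0x402c11, 0x403a07]   # scattered, gadget-like hops
print(looks_like_rop(suspicious, benign))               # True -> escalate to full analysis

In the paper, the similarity score comes from a trained GMN over dynamically collected features rather than raw edge overlap; the sketch only conveys the graph-comparison framing.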
Citations: 0
Smart contract languages: A comparative analysis
IF 6.2 | CAS Tier 2 (Computer Science) | Q1 COMPUTER SCIENCE, THEORY & METHODS | Pub Date: 2024-10-22 | DOI: 10.1016/j.future.2024.107563
Massimo Bartoletti, Lorenzo Benetollo, Michele Bugliesi, Silvia Crafa, Giacomo Dal Sasso, Roberto Pettinau, Andrea Pinna, Mattia Piras, Sabina Rossi, Stefano Salis, Alvise Spanò, Viacheslav Tkachenko, Roberto Tonelli, Roberto Zunino
Smart contracts have played a pivotal role in the evolution of blockchains and Decentralized Applications (DApps). As DApps continue to gain widespread adoption, multiple smart contract languages have been and are being made available to developers, each with its distinctive features, strengths, and weaknesses. In this paper, we examine the smart contract languages used in major blockchain platforms, with the goal of providing a comprehensive assessment of their main properties. Our analysis targets the programming languages rather than the underlying architecture: as a result, while we do consider the interplay between language design and blockchain model, our main focus remains on language-specific features such as usability, programming style, safety and security. To conduct our assessment, we propose an original benchmark which encompasses a wide, yet manageable, spectrum of key use cases that cut across all the smart contract languages under examination.
Citations: 0
Hybrid wind speed optimization forecasting system based on linear and nonlinear deep neural network structure and data preprocessing fusion
IF 6.2 | CAS Tier 2 (Computer Science) | Q1 COMPUTER SCIENCE, THEORY & METHODS | Pub Date: 2024-10-21 | DOI: 10.1016/j.future.2024.107565
Jiyang Wang, Jifeng Che, Zhiwu Li, Jialu Gao, Linyue Zhang
Wind speed time series forecasting has been widely used in wind power generation. However, the nonlinear and non-stationary characteristics of wind speed make accurate forecasting a difficult task. In recent years, the rapid development of artificial intelligence and machine learning technology has provided new solutions to the wind speed forecasting problem. Combining advances in artificial intelligence with data analysis strategies, this paper proposes a wind speed forecasting system based on multi-model fusion and integrated learning. Considering the differences in data observation and training principles across algorithms, and exploiting the advantages of each model, a wind speed forecasting system with multiple machine learning algorithms embedded in integrated learning is constructed, comprising three modules: data preprocessing, optimization, and forecasting. The data preprocessing module conducts quantitative analysis through input data decomposition and feature extraction, and the combination of a multi-objective intelligent optimization algorithm with a combined forecasting method can effectively forecast the wind speed time series. The validity of the algorithm is verified using data from a wind farm in Shandong, China. The forecasting results show that, compared with traditional single-model forecasting, the proposed integrated wind speed forecasting system based on multi-model fusion achieves higher forecasting accuracy.
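To make the fusion step concrete, here is a minimal sketch (not the paper's system): base forecasts are combined with weights inversely proportional to each model's recent validation error, a crude stand-in for the multi-objective optimization of combination weights; the model errors and predictions below are made up.

import numpy as np

# Toy combined forecast: weight each base model by the inverse of its validation error.

def combination_weights(validation_errors):
    """Higher recent error -> lower weight; weights are normalized to sum to 1."""
    inv = 1.0 / (np.asarray(validation_errors, dtype=float) + 1e-8)
    return inv / inv.sum()

def combined_forecast(base_predictions, weights):
    """Weighted average of the base models' predictions for each horizon step."""
    return np.asarray(base_predictions, dtype=float).T @ weights

# Hypothetical base models (e.g., a linear AR model and two neural forecasters).
val_errors = [0.42, 0.31, 0.55]                  # mean absolute error on a validation window
preds = [[6.1, 6.4, 6.8],                        # model 1: next three wind-speed steps (m/s)
         [5.9, 6.2, 6.9],
         [6.4, 6.7, 7.1]]
w = combination_weights(val_errors)
print(np.round(combined_forecast(preds, w), 2))  # fused forecast per step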
Citations: 0
Serverless Computing for Next-generation Application Development
IF 6.2 | CAS Tier 2 (Computer Science) | Q1 COMPUTER SCIENCE, THEORY & METHODS | Pub Date: 2024-10-21 | DOI: 10.1016/j.future.2024.107573
Adel N. Toosi, Bahman Javadi, Alexandru Iosup, Evgenia Smirni, Schahram Dustdar
Serverless computing is a cloud computing model that abstracts server management, allowing developers to focus solely on writing code without concerns about the underlying infrastructure. This paradigm shift is transforming application development by reducing time to market, lowering costs, and enhancing scalability. In serverless computing, functions are event-driven and automatically scale in response to events such as data changes or user requests. Despite its advantages, serverless computing presents several research challenges, including managing state for ephemeral functions, mitigating cold start delays, optimizing function composition, debugging, efficient auto-scaling, resource management, and ensuring security and compliance. This special issue focused on addressing these challenges by promoting research on innovative solutions and exploring the potential of serverless computing in new application domains.
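A minimal sketch of the paradigm described above, not tied to any specific platform's API: a user-written, event-driven function together with a naive scaler that sizes the instance pool to the queued events; the handler shape, class name, and threshold are hypothetical.

def handle(event):
    """The user-written function: pure business logic, no server management."""
    return {"status": "ok", "echo": event["payload"]}

class NaiveScaler:
    """Toy autoscaler: one instance per fixed-size slice of the event queue."""
    def __init__(self, events_per_instance=10):
        self.events_per_instance = events_per_instance
        self.instances = 0

    def scale_for(self, queued_events):
        # Ceiling division: enough instances to cover the current queue depth.
        self.instances = max(1, -(-len(queued_events) // self.events_per_instance))
        return self.instances

queue = [{"payload": i} for i in range(23)]   # a burst of 23 events
scaler = NaiveScaler()
print("instances:", scaler.scale_for(queue))  # -> instances: 3
print(handle(queue[0]))                       # each event is handled independently

Real platforms additionally deal with cold starts, state for ephemeral functions, and billing, which are exactly the research challenges the editorial lists.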
Citations: 0
Securing the vetaverse: Web 3.0 for decentralized Digital Twin-enhanced vehicle–road safety
IF 6.2 | CAS Tier 2 (Computer Science) | Q1 COMPUTER SCIENCE, THEORY & METHODS | Pub Date: 2024-10-21 | DOI: 10.1016/j.future.2024.107555
Sadia Jabeen Siddiqi, Sana Saleh, Mian Ahmad Jan, Muhammad Tariq
The rapid evolution of vehicular communication technologies necessitates robust security measures and enhanced road safety protocols. The integrity of data shared between vehicles, their Digital Twins (DTs), and roadside units is at stake. Intrusions into these data can lead to misinformation in Advanced Driving Assistance Systems (ADAS), with serious consequences for road safety, including improper detection of drunk-driving behaviors. In this domain, Web 3.0 emerges as an overarching approach that can transform vehicle security and ensure road safety. This paper explores the potential of Web 3.0 and its key enabling technologies to establish a VEhicular meTAVERSE (Vetaverse) that utilizes edge-based DTs of vehicles to process their dynamics shared in real time and, based on deep learning models, predict whether the driving behavior is drunk or sober. This Deep Neural Network (DNN) performs these predictions with 96% accuracy. The framework secures all Vehicle-to-Digital Twin (V2DT) communications via Multichain, a horizontally scaled parallel-blockchain platform that tamper-proofs each bit of sensor data and optimizes transaction validation time to strengthen Vetaverse security. Results reveal that this framework is accurate and computationally lightweight compared to the existing state of the art, and brings Web 3.0 to the crucial road-safety use case.
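For illustration only (not the paper's trained model), the sketch below runs a tiny two-layer network over a handful of hypothetical driving-dynamics features; the feature names, weights, and decision threshold are placeholders.

import numpy as np

# Toy two-layer network scoring driving dynamics as "drunk" vs "sober".
# The weights below are random placeholders; the paper's DNN is trained on DT data.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classify(features, w1, b1, w2, b2, threshold=0.5):
    """Forward pass: ReLU hidden layer, then a sigmoid score in [0, 1]."""
    hidden = np.maximum(0.0, w1 @ features + b1)
    score = float(sigmoid(w2 @ hidden + b2))
    return ("drunk" if score >= threshold else "sober"), score

rng = np.random.default_rng(0)
# Hypothetical features: [speed variance, lane deviation, steering entropy, reaction delay]
features = np.array([0.8, 0.6, 0.7, 0.9])
w1, b1 = rng.normal(size=(8, 4)), np.zeros(8)   # untrained placeholder parameters
w2, b2 = rng.normal(size=8), 0.0
print(classify(features, w1, b1, w2, b2))       # label and confidence score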
Citations: 0
Efficient and provably secured puncturable attribute-based signature for Web 3.0
IF 6.2 | CAS Tier 2 (Computer Science) | Q1 COMPUTER SCIENCE, THEORY & METHODS | Pub Date: 2024-10-20 | DOI: 10.1016/j.future.2024.107568
YueTong Wu, Hu Xiong, Fazlullah Khan, Salman Ijaz, Ryan Alturki, Abeer Aljohani
Web 3.0 is a grand design with intricate data interchange, implying the need for versatile network protocols to ensure its security. An attribute-based signature (ABS) allows a user, characterized by a set of attributes, to sign messages under a predicate. The validity of an ABS signature demonstrates that it was generated by a user whose attributes satisfy the corresponding predicate, thus flexibly achieving anonymous authentication. As with other digital signatures, the security of ABS is broken if the user's private key is leaked. To address the threat posed by key leakage, this paper proposes a puncturable attribute-based signature scheme that allows the private key generator to revoke the signing right associated with specific tags. The paper first elaborates the construction of the proposed puncturable ABS scheme and then proves its security theoretically by reduction to the computational Diffie–Hellman assumption. Experiments further show that the proposed puncturable ABS scheme achieves lower storage cost and superior performance.
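The puncturing idea can be sketched independently of the pairing-based construction: a signing key tracks punctured tags and refuses to sign under any of them. In the toy Python below, HMAC merely stands in for the actual attribute-based signature, and all names are illustrative.

import hmac
import hashlib

# Toy illustration of puncturing: after a tag is punctured, the key can no longer
# sign messages under that tag. HMAC is only a stand-in for the real ABS scheme.

class PuncturableKey:
    def __init__(self, secret: bytes):
        self._secret = secret
        self._punctured = set()

    def puncture(self, tag: str):
        """Revoke the right to sign under `tag` (e.g., after suspected key leakage)."""
        self._punctured.add(tag)

    def sign(self, message: bytes, tag: str) -> bytes:
        if tag in self._punctured:
            raise PermissionError(f"signing right for tag {tag!r} has been punctured")
        return hmac.new(self._secret, tag.encode() + b"|" + message, hashlib.sha256).digest()

key = PuncturableKey(b"demo-secret")
sig = key.sign(b"transfer 10 tokens", tag="epoch-42")   # allowed
key.puncture("epoch-42")
try:
    key.sign(b"transfer 10 tokens", tag="epoch-42")     # now refused
except PermissionError as e:
    print(e)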
Citations: 0
Privacy-preserving and verifiable convolution neural network inference and training in cloud computing
IF 6.2 | CAS Tier 2 (Computer Science) | Q1 COMPUTER SCIENCE, THEORY & METHODS | Pub Date: 2024-10-19 | DOI: 10.1016/j.future.2024.107560
Wei Cao, Wenting Shen, Jing Qin, Hao Lin
With the rapid development of cloud computing, outsourcing massive data and complex deep learning models to cloud servers (CSs) has become a popular trend, which also brings security problems. One is that the model stored on the CSs may be corrupted, leading to incorrect inference and training results. The other is that the privacy of the outsourced data and model may be compromised. However, existing privacy-preserving and verifiable inference schemes suffer from low detection probability, high communication overhead, and substantial computational time. To solve these problems, we propose a privacy-preserving and verifiable scheme for convolutional neural network inference and training in cloud computing. In our scheme, the model owner generates authenticators for the model parameters before uploading the model to the CSs. In the model integrity verification phase, the model owner and user can utilize these authenticators to check model integrity with high detection probability. Furthermore, we design a set of privacy-preserving protocols based on replicated secret sharing for both the inference and training phases, significantly reducing communication overhead and computational time. Through security analysis, we demonstrate that our scheme is secure. Experimental evaluations show that the proposed scheme outperforms existing schemes in privacy-preserving inference and model integrity verification.
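A minimal sketch of the replicated-secret-sharing building block, assuming three semi-honest parties and arithmetic modulo 2^32; the paper's protocols for CNN inference and training build much more on top of this (multiplication, truncation, verification), which is omitted here.

import secrets

MOD = 2 ** 32  # toy ring; real schemes fix a ring/field suited to the computation

def share(x):
    """Split x into three additive shares; party i holds the pair (s_i, s_{i+1})."""
    s = [secrets.randbelow(MOD), secrets.randbelow(MOD)]
    s.append((x - s[0] - s[1]) % MOD)
    return [(s[0], s[1]), (s[1], s[2]), (s[2], s[0])]

def reconstruct(shares):
    """Two parties together hold all three additive shares; summing them recovers x."""
    s0, s1 = shares[0]
    _, s2 = shares[1]
    return (s0 + s1 + s2) % MOD

def add_shares(a, b):
    """Addition is local: each party adds its two components, no communication needed."""
    return [((x0 + y0) % MOD, (x1 + y1) % MOD) for (x0, x1), (y0, y1) in zip(a, b)]

a, b = share(1234), share(5678)
print(reconstruct(add_shares(a, b)))   # 6912, without any single party seeing 1234 or 5678

Additions (and, by extension, the linear parts of convolutions) stay communication-free in this representation; the interactive and verification machinery is where schemes like the one proposed differ.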
Citations: 0
SeCTIS: A framework to Secure CTI Sharing
IF 6.2 | CAS Tier 2 (Computer Science) | Q1 COMPUTER SCIENCE, THEORY & METHODS | Pub Date: 2024-10-19 | DOI: 10.1016/j.future.2024.107562
Dincy R. Arikkat, Mert Cihangiroglu, Mauro Conti, Rafidha Rehiman K.A., Serena Nicolazzo, Antonino Nocera, Vinod P.
The rise of IT-dependent operations in modern organizations has heightened their vulnerability to cyberattacks. By integrating more interconnected devices into their operations, organizations are inadvertently enlarging their exposure to cyber threats, which are becoming both more sophisticated and more common. Consequently, organizations have been compelled to seek innovative approaches to mitigate the menaces inherent in their infrastructure. In response, considerable research effort has been directed toward creating effective solutions for sharing Cyber Threat Intelligence (CTI). Current information-sharing methods lack privacy safeguards, leaving organizations vulnerable to leaks of proprietary and confidential data. To tackle this problem, we designed a novel framework called SeCTIS (Secure Cyber Threat Intelligence Sharing), integrating Swarm Learning and Blockchain technologies to enable businesses to collaborate while preserving the privacy of their CTI data. Moreover, our approach provides a way to assess data and model quality, as well as the trustworthiness of all participants, by leveraging validators through Zero-Knowledge Proofs. Extensive experimentation has confirmed the accuracy and performance of our framework. Furthermore, our detailed attack model analyzes its resistance to attacks that could impact data and model quality.
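To make the validator role concrete, here is a deliberately simplified sketch that omits the Swarm Learning, blockchain, and Zero-Knowledge-Proof machinery: validators score a submitted model update on their own held-out data and a quorum decides acceptance; every name, dataset, and threshold is hypothetical.

# Toy validator check: accept a submitted CTI-classifier update only if enough
# validators find it accurate on their own held-out data. The blockchain anchoring
# and zero-knowledge proofs of the actual framework are deliberately omitted.

def accuracy(model, dataset):
    """Fraction of held-out samples the candidate model labels correctly."""
    return sum(model(x) == y for x, y in dataset) / len(dataset)

def validators_accept(model, validator_datasets, min_accuracy=0.8, quorum=2):
    votes = sum(accuracy(model, ds) >= min_accuracy for ds in validator_datasets)
    return votes >= quorum

# Hypothetical candidate model: flags a feature vector as malicious if its sum is large.
candidate = lambda x: int(sum(x) > 1.5)
validator_data = [
    [([0.9, 0.9], 1), ([0.1, 0.2], 0), ([1.0, 0.8], 1), ([0.2, 0.1], 0)],
    [([0.7, 1.0], 1), ([0.3, 0.1], 0), ([0.9, 0.9], 1), ([0.0, 0.2], 0)],
    [([0.5, 0.4], 1), ([0.6, 0.2], 0), ([0.9, 1.0], 1), ([0.1, 0.1], 0)],
]
print(validators_accept(candidate, validator_data))   # True if at least 2 of 3 validators agree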
Citations: 0
ServlessSimPro: A comprehensive serverless simulation platform
IF 6.2 | CAS Tier 2 (Computer Science) | Q1 COMPUTER SCIENCE, THEORY & METHODS | Pub Date: 2024-10-19 | DOI: 10.1016/j.future.2024.107558
Han Cao, Jinquan Zhang, Long Chen, Siyuan Li, Guang Shi
Serverless computing represents an emerging paradigm within cloud computing, characterized by enabling developers to run applications without having to manage the underlying servers. Although several mature serverless computing platforms currently exist, there are few open-source simulation platforms that can accurately simulate the characteristics of serverless environments and provide researchers with a free and convenient tool for their investigations. Furthermore, existing simulation platforms do not provide comprehensive interfaces for scheduling strategies, nor do they offer the diverse monitoring metrics required by researchers. In response to this gap, we have developed the ServlessSimPro simulation platform. This platform offers the most comprehensive set of scheduling algorithms, diverse evaluation metrics, and extensive interfaces and parameters among all existing serverless simulators. The experimental results demonstrate that the simulator's scheduling algorithm can effectively reduce latency, enhance resource utilization, and decrease energy consumption.
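As a rough idea of what a serverless simulator's core loop looks like (not ServlessSimPro's actual interfaces), the sketch below charges a cold-start penalty whenever no warm container is idle; the timing constants are hypothetical.

import heapq

# Toy serverless simulation core: each invocation reuses a warm container if one is
# free, otherwise it pays a cold-start penalty. Not ServlessSimPro's real interface.

COLD_START = 0.25   # seconds, hypothetical
EXEC_TIME = 0.10    # seconds per invocation, hypothetical

def simulate(arrival_times):
    """Return per-invocation latency; a min-heap tracks when each container frees up."""
    warm_free_at = []          # times at which existing containers become idle
    latencies = []
    for t in sorted(arrival_times):
        if warm_free_at and warm_free_at[0] <= t:
            heapq.heappop(warm_free_at)            # reuse an idle warm container
            start = t
        else:
            start = t + COLD_START                 # no idle container: cold start
        finish = start + EXEC_TIME
        heapq.heappush(warm_free_at, finish)
        latencies.append(finish - t)
    return latencies

arrivals = [0.0, 0.05, 0.50, 0.52, 0.90]
print([round(l, 2) for l in simulate(arrivals)])   # mixes cold (0.35) and warm (0.10) latencies

A scheduler interface in a full simulator would replace the "reuse the first idle container" rule with pluggable placement and keep-alive policies, which is where the platform's monitoring metrics come in.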
Citations: 0
Toward data efficient anomaly detection in heterogeneous edge–cloud environments using clustered federated learning
IF 6.2 | CAS Tier 2 (Computer Science) | Q1 COMPUTER SCIENCE, THEORY & METHODS | Pub Date: 2024-10-19 | DOI: 10.1016/j.future.2024.107559
Zongpu Wei, Jinsong Wang, Zening Zhao, Kai Shi
Anomaly detection in edge–cloud scenarios is a critical means of ensuring the security of the network environment. Federated learning (FL)-based anomaly detection combines multiple data sources and ensures data privacy, making it a promising distributed detection method. However, FL-based anomaly detection systems are usually affected by data heterogeneity and data bias, resulting in inefficient use of the data available for FL and declining detection performance. We propose an iterative federated clustering ensemble algorithm named IFCEA, in which we (1) establish a committee on the devices and select the optimal participation for each device based on the committee's evaluations; (2) filter the clusters based on the committee results and exclude biased clusters; (3) design an aggregation weight that reflects the degree of local distribution balance; and (4) present a novel cluster initialization method, OneBiPartition, which adapts to the number of clusters and starts the federated clustering task efficiently. IFCEA enhances the quality of the data used in FL-based anomaly detection from two perspectives, device selection and participation weights, effectively addressing the data heterogeneity and data bias faced during the FL training phase. Extensive experimental results on five network traffic datasets (the UNSW-NB15, CIC-IDS2017, CIC-IDS2018, CIC-DDoS2019, and BCCC-DDoS2024 datasets) demonstrate that our proposed framework outperforms existing methods in terms of detection metrics and convergence performance.
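A minimal sketch of step (3) only, under the assumption that "local distribution balance" is measured by the normalized entropy of each client's label distribution; IFCEA's committee, cluster filtering, and OneBiPartition steps are omitted, and the client data below is made up.

import numpy as np

# Toy balance-aware aggregation: weight each client's model update by its sample
# count scaled by the normalized entropy of its local label distribution, so that
# clients with severely skewed data contribute less to the global model.

def balance_score(label_counts):
    """Normalized entropy in [0, 1]: 1 = perfectly balanced labels, 0 = single class."""
    p = np.asarray(label_counts, dtype=float)
    p = p[p > 0] / p.sum()
    if len(p) <= 1:
        return 0.0
    return float(-(p * np.log(p)).sum() / np.log(len(label_counts)))

def aggregate(updates, label_counts_per_client):
    """Weighted average of parameter vectors using sample count x balance score."""
    weights = np.array([sum(c) * balance_score(c) for c in label_counts_per_client])
    weights = weights / weights.sum()
    return np.average(np.stack(updates), axis=0, weights=weights)

updates = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([0.5, 0.5])]
label_counts = [[50, 50], [95, 5], [40, 60]]      # the second client is heavily imbalanced
print(np.round(aggregate(updates, label_counts), 3))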
Citations: 0