Guoqiang Deng, Min Tang, Zengyi Huang, Yuhao Zhang, Yuxing Xi
{"title":"基于良好分离结构的机密外包支持向量机学习","authors":"Guoqiang Deng , Min Tang , Zengyi Huang , Yuhao Zhang , Yuxing Xi","doi":"10.1016/j.future.2024.107564","DOIUrl":null,"url":null,"abstract":"<div><div>Support Vector Machine (SVM) has revolutionized various domains and achieved remarkable successes. This progress relies on subtle algorithms and more on large training samples. However, the massive data collection introduces security concerns. To facilitate secure integration of data efficiently for building an accurate SVM classifier, we present a non-interactive protocol for privacy-preserving SVM, named <em>NPSVMT</em>. Specifically, we define a new well-separated structure for computing gradients that can decouple the fusion matter between user data and model parameters, allowing data providers to outsource the collaborative learning task to the cloud. As a result, <em>NPSVMT</em> is capable of removing the multiple communications and eliminating the straggler’s effect (waiting for the last), thereby going beyond those developed with interactive methods, e.g., federated learning. To further decrease the data traffic, we introduce a high-efficient coding method to compress and parse training data. In addition, unlike outsourced schemes based on homomorphic encryption or secret sharing, <em>NPSVMT</em> exploits functional encryption to maintain the data confidentiality, achieving dropout-tolerant secure aggregation. The implementations verify that <em>NPSVMT</em> is faster by orders of magnitude than the existing privacy-preserving SVM schemes on benchmark datasets.</div></div>","PeriodicalId":55132,"journal":{"name":"Future Generation Computer Systems-The International Journal of Escience","volume":"164 ","pages":"Article 107564"},"PeriodicalIF":6.2000,"publicationDate":"2024-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Confidential outsourced support vector machine learning based on well-separated structure\",\"authors\":\"Guoqiang Deng , Min Tang , Zengyi Huang , Yuhao Zhang , Yuxing Xi\",\"doi\":\"10.1016/j.future.2024.107564\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Support Vector Machine (SVM) has revolutionized various domains and achieved remarkable successes. This progress relies on subtle algorithms and more on large training samples. However, the massive data collection introduces security concerns. To facilitate secure integration of data efficiently for building an accurate SVM classifier, we present a non-interactive protocol for privacy-preserving SVM, named <em>NPSVMT</em>. Specifically, we define a new well-separated structure for computing gradients that can decouple the fusion matter between user data and model parameters, allowing data providers to outsource the collaborative learning task to the cloud. As a result, <em>NPSVMT</em> is capable of removing the multiple communications and eliminating the straggler’s effect (waiting for the last), thereby going beyond those developed with interactive methods, e.g., federated learning. To further decrease the data traffic, we introduce a high-efficient coding method to compress and parse training data. In addition, unlike outsourced schemes based on homomorphic encryption or secret sharing, <em>NPSVMT</em> exploits functional encryption to maintain the data confidentiality, achieving dropout-tolerant secure aggregation. 
The implementations verify that <em>NPSVMT</em> is faster by orders of magnitude than the existing privacy-preserving SVM schemes on benchmark datasets.</div></div>\",\"PeriodicalId\":55132,\"journal\":{\"name\":\"Future Generation Computer Systems-The International Journal of Escience\",\"volume\":\"164 \",\"pages\":\"Article 107564\"},\"PeriodicalIF\":6.2000,\"publicationDate\":\"2024-10-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Future Generation Computer Systems-The International Journal of Escience\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0167739X24005284\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, THEORY & METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Future Generation Computer Systems-The International Journal of Escience","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167739X24005284","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Confidential outsourced support vector machine learning based on well-separated structure
Support Vector Machine (SVM) learning has revolutionized various domains and achieved remarkable successes. This progress relies not only on sophisticated algorithms but, even more, on large training samples. However, massive data collection introduces security concerns. To enable secure and efficient integration of data for building an accurate SVM classifier, we present a non-interactive protocol for privacy-preserving SVM, named NPSVMT. Specifically, we define a new well-separated structure for computing gradients that decouples user data from model parameters in the fusion step, allowing data providers to outsource the collaborative learning task to the cloud. As a result, NPSVMT removes the need for multiple communication rounds and eliminates the straggler effect (waiting for the slowest participant), going beyond schemes built on interactive methods such as federated learning. To further reduce data traffic, we introduce a highly efficient coding method to compress and parse training data. In addition, unlike outsourced schemes based on homomorphic encryption or secret sharing, NPSVMT exploits functional encryption to maintain data confidentiality, achieving dropout-tolerant secure aggregation. Our implementations verify that NPSVMT is faster by orders of magnitude than existing privacy-preserving SVM schemes on benchmark datasets.
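The idea of a well-separated gradient structure can be pictured with a minimal sketch, assuming (purely for illustration, not as the paper's actual construction) a least-squares-style SVM objective L(w) = 0.5*||Xw - y||^2 + 0.5*lam*||w||^2. Its gradient A w - b + lam*w depends on the data only through the statistics A = X^T X and b = X^T y, so each provider can contribute these once and the cloud can iterate on the model parameters without further interaction. The function names below are hypothetical, and the aggregation is shown in plaintext; in NPSVMT the contributions would additionally be protected with functional encryption.

```python
import numpy as np

def local_statistics(X_k, y_k):
    """Data-only summaries a single provider computes locally."""
    return X_k.T @ X_k, X_k.T @ y_k

def aggregate(stats):
    """Cloud-side one-shot aggregation of all providers' summaries."""
    A = sum(A_k for A_k, _ in stats)
    b = sum(b_k for _, b_k in stats)
    return A, b

def train(A, b, lam=0.1, lr=0.001, steps=2000):
    """Gradient descent using only the aggregated statistics."""
    w = np.zeros(A.shape[0])
    for _ in range(steps):
        # Data (A, b) and parameters (w) stay decoupled: no provider is
        # contacted again once the aggregate statistics are available.
        grad = A @ w - b + lam * w
        w -= lr * grad
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three hypothetical providers with labels in {-1, +1}.
    parts = [(rng.normal(size=(40, 5)), rng.choice([-1.0, 1.0], size=40))
             for _ in range(3)]
    A, b = aggregate([local_statistics(X_k, y_k) for X_k, y_k in parts])
    print("learned weights:", train(A, b))
```

Because the cloud never needs a fresh round trip to the providers after aggregation, a provider that drops out simply contributes nothing, which is what makes the non-interactive, dropout-tolerant setting possible in this simplified picture.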
About the journal:
Computing infrastructures and systems are constantly evolving, resulting in increasingly complex and collaborative scientific applications. To cope with these advancements, there is a growing need for collaborative tools that can effectively map, control, and execute these applications.
Furthermore, with the explosion of Big Data, there is a requirement for innovative methods and infrastructures to collect, analyze, and derive meaningful insights from the vast amount of data generated. This necessitates the integration of computational and storage capabilities, databases, sensors, and human collaboration.
Future Generation Computer Systems aims to pioneer advancements in distributed systems, collaborative environments, high-performance computing, and Big Data analytics. It strives to stay at the forefront of developments in grids, clouds, and the Internet of Things (IoT) to effectively address the challenges posed by these wide-area, fully distributed sensing and computing systems.