{"title":"利用极大化负例与最小球面中心之间的距离改进支持向量域描述","authors":"Mohamed el Boujnouni, M. Jedra","doi":"10.32604/csse.2018.33.409","DOIUrl":null,"url":null,"abstract":"Support Vector Domain Description (SVDD) is an effective kernel-based method used for data description. It was motivated by the success of Support Vector Machine (SVM) and thus has inherited many of its attractive properties. It has been extensively used for novelty detection and has been applied successfully to a variety of classification problems. This classifier aims to find a sphere with minimal volume including the majority of examples that belong to the class of interest (positive) and excluding the most of examples that are either outliers or belong to other classes (negatives). In this paper we propose a new approach to improve the classification accuracy of SVDD. This objective will be achieved by exploiting the existence of negative examples in the training step, without increasing the computational time and memory resources required to solve the quadratic programming problem of that classifier. Simulation results on two challenging artificial problems, namely chessboard and two spirals, and four benchmark datasets have successfully validated the effectiveness of the proposed method.","PeriodicalId":119237,"journal":{"name":"Commun. Stat. Simul. Comput.","volume":"39 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Improving Support Vector Domain Description by Maximizing the Distance Between Negative Examples and The Minimal Sphere Center's\",\"authors\":\"Mohamed el Boujnouni, M. Jedra\",\"doi\":\"10.32604/csse.2018.33.409\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Support Vector Domain Description (SVDD) is an effective kernel-based method used for data description. It was motivated by the success of Support Vector Machine (SVM) and thus has inherited many of its attractive properties. It has been extensively used for novelty detection and has been applied successfully to a variety of classification problems. This classifier aims to find a sphere with minimal volume including the majority of examples that belong to the class of interest (positive) and excluding the most of examples that are either outliers or belong to other classes (negatives). In this paper we propose a new approach to improve the classification accuracy of SVDD. This objective will be achieved by exploiting the existence of negative examples in the training step, without increasing the computational time and memory resources required to solve the quadratic programming problem of that classifier. Simulation results on two challenging artificial problems, namely chessboard and two spirals, and four benchmark datasets have successfully validated the effectiveness of the proposed method.\",\"PeriodicalId\":119237,\"journal\":{\"name\":\"Commun. Stat. Simul. Comput.\",\"volume\":\"39 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Commun. Stat. Simul. 
Comput.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.32604/csse.2018.33.409\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Commun. Stat. Simul. Comput.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.32604/csse.2018.33.409","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Improving Support Vector Domain Description by Maximizing the Distance Between Negative Examples and The Minimal Sphere Center's
Support Vector Domain Description (SVDD) is an effective kernel-based method for data description. It was motivated by the success of the Support Vector Machine (SVM) and has therefore inherited many of its attractive properties. It has been used extensively for novelty detection and has been applied successfully to a variety of classification problems. This classifier seeks a sphere of minimal volume that encloses the majority of examples belonging to the class of interest (positives) while excluding most of the examples that are either outliers or belong to other classes (negatives). In this paper, we propose a new approach to improve the classification accuracy of SVDD. This objective is achieved by exploiting the negative examples during training, without increasing the computational time or memory resources required to solve the classifier's quadratic programming problem. Simulation results on two challenging artificial problems, namely the chessboard and the two spirals, and on four benchmark datasets validate the effectiveness of the proposed method.
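For reference, the baseline sphere-optimization problem that the abstract alludes to can be sketched in the standard SVDD-with-negatives form of Tax and Duin; this is only the starting formulation, not the paper's specific modification that maximizes the distance between negatives and the sphere center:

\begin{aligned}
\min_{R,\,a,\,\xi}\quad & R^{2} + C_{1}\sum_{i}\xi_{i} + C_{2}\sum_{l}\xi_{l} \\
\text{s.t.}\quad & \|\phi(x_{i}) - a\|^{2} \le R^{2} + \xi_{i}, \quad \xi_{i} \ge 0 \quad \text{(positive examples } x_{i}\text{)} \\
& \|\phi(x_{l}) - a\|^{2} \ge R^{2} - \xi_{l}, \quad \xi_{l} \ge 0 \quad \text{(negative examples } x_{l}\text{)}
\end{aligned}

Here $a$ is the sphere center, $R$ its radius, $\phi$ the kernel-induced feature map, and $C_{1}$, $C_{2}$ trade-off parameters penalizing positives left outside and negatives left inside the sphere, respectively. The resulting dual is a quadratic program of the same size and structure as that of standard SVDD, which is consistent with the abstract's claim that exploiting negatives need not increase computational cost.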