Pub Date: 2017-08-01  DOI: 10.1109/Trustcom/BigDataSE/ICESS.2017.219
WENC: HTTPS Encrypted Traffic Classification Using Weighted Ensemble Learning and Markov Chain
Wubin Pan, Guang Cheng, Yongning Tang
The SSL/TLS protocol is widely used to secure web applications (i.e., HTTPS). Classifying encrypted SSL/TLS-based applications is an important but challenging task for network management. Traditional traffic classification methods are incapable of accomplishing this task, and several recently proposed approaches that focus on discriminating fingerprints among various SSL/TLS applications have also shown limitations. In this paper, we design a Weighted ENsemble Classifier (WENC) to address these limitations. WENC studies the characteristics of various sub-flows during the HTTPS handshake and the subsequent data transmission period. To increase fingerprint recognizability, we establish a second-order Markov chain model whose fingerprint variable jointly considers the packet length and the message type during the HTTPS handshake. Furthermore, the series of packet lengths of the application data is modeled as a hidden Markov model (HMM) with optimal emission probabilities. Finally, a weighted ensemble strategy combines the advantages of the individual approaches into a unified classifier. Experimental results show that the classification accuracy of the proposed method reaches 90%, an 11% improvement on average compared to state-of-the-art methods.
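The joint (message type, packet length) second-order Markov model described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the handshake state labels are invented for the example:

```python
import math
from collections import defaultdict

def second_order_transitions(flows):
    """Estimate second-order Markov transition probabilities over joint
    (message_type, packet_length) states. Each flow is a list of
    (message_type, packet_length) tuples from one handshake."""
    counts = defaultdict(lambda: defaultdict(int))
    for flow in flows:
        for i in range(len(flow) - 2):
            ctx = (flow[i], flow[i + 1])      # second-order context
            counts[ctx][flow[i + 2]] += 1
    probs = {}
    for ctx, nxt in counts.items():
        total = sum(nxt.values())
        probs[ctx] = {state: c / total for state, c in nxt.items()}
    return probs

def flow_score(flow, probs, floor=1e-6):
    """Log-likelihood of a flow under a fitted model; classification picks
    the application class whose model gives the highest score."""
    score = 0.0
    for i in range(len(flow) - 2):
        ctx = (flow[i], flow[i + 1])
        score += math.log(probs.get(ctx, {}).get(flow[i + 2], floor))
    return score
```

One such model would be fitted per application class, with the ensemble weighting these scores against the HMM over application-data packet lengths.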
Pub Date: 2017-08-01  DOI: 10.1109/Trustcom/BigDataSE/ICESS.2017.361
On the Performance of a Trustworthy Remote Entity in Comparison to Secure Multi-party Computation
Robin Ankele, A. Simpson
Novel trusted hardware extensions such as Intel's SGX enable user-space applications to be protected against potentially malicious operating systems. Moreover, SGX supports strong attestation guarantees, whereby remote parties can be convinced of the trustworthy nature of the executing user-space application. These developments are particularly interesting in the context of large-scale privacy-preserving data mining. In a typical data mining scenario, mutually distrustful parties have to share potentially sensitive data with an untrusted server, which in turn computes a data mining operation and returns the result to the clients. Generally, such collaborative tasks are referred to as secure multi-party computation (MPC) problems. Privacy-preserving distributed data mining has the additional requirement of (output) privacy preservation (typically achieved by adding random noise to the function output); additionally, it limits the general-purpose functionality to distinct data mining operations. To solve these problems in a scalable and efficient manner, the concept of a Trustworthy Remote Entity (TRE) was recently introduced. We report upon the performance of an SGX-based TRE and compare our results to popular secure MPC frameworks. Due to limitations of the MPC frameworks, we benchmarked only simple operations (and argue that more complex data mining operations can be established by composing several basic operations). We consider both a two-party setting (where we iterate over the number of operations) and a multi-party setting (where we iterate over the number of participants).
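The "simple operations" benchmarked in an MPC framework boil down to arithmetic on secret-shared values. A minimal sketch of additive secret sharing (the basic building block; this is a generic illustration, not the specific frameworks benchmarked in the paper):

```python
import secrets

P = 2**61 - 1  # prime modulus; an illustrative choice

def share(x, n=2):
    """Split x into n additive shares mod P; any n-1 shares look random."""
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % P)
    return parts

def reconstruct(parts):
    return sum(parts) % P

def add_shared(a_parts, b_parts):
    """Each party adds its own shares locally; secure addition needs
    no communication, which is why it benchmarks so cheaply."""
    return [(a + b) % P for a, b in zip(a_parts, b_parts)]
```

Multiplication, by contrast, requires interaction between the parties, which is where MPC frameworks pay their communication cost relative to a TRE.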
Pub Date: 2017-08-01  DOI: 10.1109/Trustcom/BigDataSE/ICESS.2017.311
Mixed-Criticality Control System with Performance and Robustness Guarantees
Long Cheng, Kai Huang, Gang Chen, Biao Hu, A. Knoll
Nowadays, many embedded systems consist of a mix of control applications and soft real-time tasks. This paper studies how to ensure the worst-case quality of control for control applications under disturbances while providing maximal resources to soft real-time tasks. To solve this problem, we propose a mixed-criticality control system model in which tasks can switch between two operating modes, LO and HI, according to the states of the controlled plants. In HI mode, the worst-case quality of control is guaranteed, while in LO mode, system resources are balanced between the two classes of tasks. We compare our approach with two other approaches from the literature. Case study results demonstrate the effectiveness of our system model.
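The plant-state-driven mode switch can be sketched as a simple hysteresis rule. This is an illustrative guess at the mechanism, not the paper's actual switching condition; the threshold names are invented:

```python
def next_mode(plant_error, hi_threshold, lo_threshold, mode):
    """Switch to HI when the plant error exceeds hi_threshold (so the
    controller gets guaranteed resources), and back to LO only once the
    error falls below lo_threshold. The gap between the two thresholds
    prevents rapid mode oscillation."""
    if mode == "LO" and abs(plant_error) > hi_threshold:
        return "HI"
    if mode == "HI" and abs(plant_error) < lo_threshold:
        return "LO"
    return mode
```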
Pub Date: 2017-08-01  DOI: 10.1109/Trustcom/BigDataSE/ICESS.2017.307
Privileged Data Within Digital Evidence
Dominique Fleurbaaij, M. Scanlon, Nhien-An Le-Khac
In recent years, the use of digital communication has increased, and with it the chance of finding privileged data in digital evidence. Privileged data is protected by law from viewing by anyone other than the client, so the digital investigator must handle it properly without being able to view its contents. Procedures for handling this information exist, but they provide little practical guidance, and it is not known how effective filtering is. The objective of this paper is to describe how current digital forensic tools handle privileged data, and to present a script written for the digital forensic tool Nuix. The script automates the handling of privileged data to minimize exposure of its contents to the digital investigator. It also uses Nuix technology that extends the automated search for identical privileged documents to relating files based on their contents. A comparison of the 'traditional' filtering in digital forensic tools against the Nuix script showed that the tools are still limited when used on privileged data. The script increases filtering effectiveness as a direct result of relating files by content.
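The core of such a workflow is partitioning the evidence set into privileged and reviewable material before an investigator sees any content. A toy sketch of the term-matching stage (purely illustrative; the paper's Nuix script additionally propagates flags to related files via content similarity):

```python
def flag_privileged(documents, privileged_terms):
    """Partition documents into (privileged, reviewable) by term matching.
    `documents` maps a document id to its extracted text."""
    privileged, reviewable = [], []
    for doc_id, text in documents.items():
        lowered = text.lower()
        if any(term.lower() in lowered for term in privileged_terms):
            privileged.append(doc_id)
        else:
            reviewable.append(doc_id)
    return privileged, reviewable
```

As the paper notes, pure keyword matching misses renamed or edited copies of privileged documents, which is why content-based relation of files matters.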
Pub Date: 2017-08-01  DOI: 10.1109/Trustcom/BigDataSE/ICESS.2017.251
Grouping-Proofs Based Access Control Using KP-ABE for IoT Applications
Lyes Touati
The Internet of Things (IoT) is a new paradigm in which everyday objects are interconnected with each other and with the Internet. The paradigm is receiving much attention from the scientific community and is applied in many fields. In some applications, it is useful to prove that a number of objects are simultaneously present in a group. For example, an individual might want to authorize an NFC payment with his mobile phone only if k of his devices are present, to ensure that he is the right person. This principle is known as grouping-proofs. However, existing grouping-proof schemes are mostly designed for RFID systems and do not fit the characteristics of the IoT. In this paper, we propose a threshold grouping-proof scheme for IoT applications. Our scheme uses Key-Policy Attribute-Based Encryption (KP-ABE) to encrypt a message so that it can be decrypted only if at least k objects are simultaneously present in the same location. A security analysis and performance evaluation demonstrate the effectiveness of the proposed solution.
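KP-ABE itself relies on pairing-based cryptography, but the k-of-n threshold idea it builds on can be illustrated with Shamir secret sharing: a decryption secret is split so that any k devices together can recover it, while fewer than k learn nothing. This is an analogy for the threshold mechanism, not the paper's KP-ABE construction:

```python
import secrets

P = 2**127 - 1  # prime modulus (Mersenne prime); illustrative choice

def shamir_share(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def shamir_reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

With k = 3 of n = 5 device shares present, decryption succeeds; with only two, the polynomial is underdetermined and the secret stays hidden. (Requires Python 3.8+ for the modular inverse via `pow(x, -1, P)`.)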
Pub Date: 2017-08-01  DOI: 10.1109/Trustcom/BigDataSE/ICESS.2017.223
Detection of Single Event Transients Based on Compressed Sensing
C. Shao, Huiyun Li
Single event transients (SETs) seriously deteriorate the reliability of integrated circuits (ICs), especially those in mission- or security-critical applications. Detecting and locating SETs is useful for fault analysis and future hardening. Traditional SET detection methods usually require special sensors embedded in the circuits, or fine-resolution radiation scanning over the chip surface. In this paper, we establish the relationship between the sparsity of SETs and the overall faults, and develop a compressed sensing method to locate SETs in ICs without any embedded sensors or imaging processing. A case study on a cryptographic IC by logic simulation verifies that the proposed method has two main advantages: 1) the SET-sensitive area can be accurately identified; 2) the sampling rate is reduced by 70%, so test efficiency is greatly improved with negligible hardware overhead.
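The compressed sensing intuition is that a sparse fault vector can be pinpointed from far fewer measurements than locations. A deliberately simplified sketch for the 1-sparse case (real compressed sensing recovers k-sparse signals via greedy or l1 methods; this toy only illustrates why sparsity makes few measurements sufficient):

```python
def recover_one_sparse(A, y):
    """Recover a 1-sparse x from measurements y = A @ x, with A given as
    an m x n nested list. Picks the column best explaining y, then solves
    for its coefficient by least squares on that single column."""
    m, n = len(A), len(A[0])
    best_j, best_score = 0, 0.0
    for j in range(n):
        col = [A[i][j] for i in range(m)]
        corr = abs(sum(c * yi for c, yi in zip(col, y)))
        norm = sum(c * c for c in col)
        if norm and corr / norm > best_score:
            best_score = corr / norm
            best_j = j
    col = [A[i][best_j] for i in range(m)]
    value = sum(c * yi for c, yi in zip(col, y)) / sum(c * c for c in col)
    return best_j, value
```

Here two measurements locate a fault among three candidate positions; in the paper's setting this scaling gap is what permits the 70% reduction in sampling.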
Pub Date: 2017-08-01  DOI: 10.1109/Trustcom/BigDataSE/ICESS.2017.365
An Efficient Approach for Advanced Malware Analysis Using Memory Forensic Technique
Chathuranga Rathnayaka, Aruna Jamdagni
Static analysis of malware has been complicated by string-searching methods. Forensic investigation of physical memory, or memory forensics, provides a comprehensive analysis of malware by checking the traces left in memory dumps created while the malware runs in an operating system. In this study, we propose an efficient and robust framework for analysing complex malware by integrating static analysis techniques with memory forensic techniques. The proposed framework was evaluated on two hundred real malware samples and achieved a 90% detection rate. These results were compared and verified against results from www.virustotal.com, an online malware analysis tool. Additionally, we identified the sources of many malware samples.
Pub Date: 2017-08-01  DOI: 10.1109/Trustcom/BigDataSE/ICESS.2017.296
Enhanced Operating System Protection to Support Digital Forensic Investigations
J. McDonald, Ramya Manikyam, W. Glisson, T. Andel, Y. Gu
Digital forensic investigators today are faced with numerous problems when recovering footprints of criminal activity that involve the use of computer systems. Investigators need the ability to recover evidence in a forensically sound manner, even when criminals actively work to alter the integrity, veracity, and provenance of data, applications and software that are used to support illicit activities. In many ways, operating systems (OS) can be strengthened from a technological viewpoint to support verifiable, accurate, and consistent recovery of system data when needed for forensic collection efforts. In this paper, we extend the ideas for forensic-friendly OS design by proposing the use of a practical form of computing on encrypted data (CED) and computing with encrypted functions (CEF), which builds upon prior work on component encryption (in circuits) and white-box cryptography (in software). We conduct experiments on sample programs to provide analysis of the approach based on security and efficiency, illustrating how component encryption can strengthen key OS functions and improve tamper-resistance to anti-forensic activities. We analyze the tradeoff space for use of the algorithm in a holistic approach that provides additional security and comparable properties to fully homomorphic encryption (FHE).
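Computing on encrypted data means operating on ciphertexts so that decryption yields the result of the operation. The simplest classroom example is the multiplicative homomorphism of textbook RSA; this illustrates the CED concept only, not the paper's component-encryption scheme, and the parameters are toy-sized:

```python
# Toy parameters only: real deployments need large primes and padding,
# and textbook RSA leaks information. Purely illustrative.
p, q = 61, 53
n = p * q                  # modulus, 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent (Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

def mult_encrypted(c1, c2):
    """Multiply two ciphertexts; decrypting the result yields the
    product of the plaintexts, without ever decrypting the inputs."""
    return (c1 * c2) % n
```

Fully homomorphic encryption extends this to arbitrary circuits at a much higher cost, which is the tradeoff space the paper analyzes.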
Pub Date: 2017-08-01  DOI: 10.1109/Trustcom/BigDataSE/ICESS.2017.336
Privacy-Preserving Queries over Secret-Shared Graph-Structured Data
Leyla Roohi, Vanessa Teague
We investigate the use of the SPDZ multiparty computation platform to facilitate secure cloud storage of graph-structured data such as telecommunications metadata. We report on an implementation of a simple scheme for answering adjacency, nearest-neighbour and second-hop queries. Our solution hides the data, the query and the answer from the cloud servers unless they all collude to recover them.
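The storage layer of such a scheme can be pictured as secret-sharing each adjacency-matrix entry across the cloud servers. A toy sketch of the sharing idea (illustrative only; in SPDZ the query and combination happen inside the protocol with MACs, never in the clear as here):

```python
import secrets

MOD = 2**32

def share_adjacency(adj, n_servers=3):
    """Split each entry of an adjacency matrix into additive shares, one
    matrix per server; no single server learns any edge of the graph."""
    size = len(adj)
    shared = [[[0] * size for _ in range(size)] for _ in range(n_servers)]
    for i in range(size):
        for j in range(size):
            parts = [secrets.randbelow(MOD) for _ in range(n_servers - 1)]
            parts.append((adj[i][j] - sum(parts)) % MOD)
            for s in range(n_servers):
                shared[s][i][j] = parts[s]
    return shared

def query_edge(shared, i, j):
    """Reconstruct one entry by combining all servers' shares; only full
    collusion of the servers reveals the graph."""
    return sum(s[i][j] for s in shared) % MOD
```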
Pub Date: 2017-08-01  DOI: 10.1109/Trustcom/BigDataSE/ICESS.2017.338
Web Anomaly Detection Based on Frequent Closed Episode Rules
Lei Wang, Shoufeng Cao, Lin Wan, Fengyu Wang
As web services spread around the world, new threats are increasing. Misuse-based intrusion detection systems cannot provide enough protection for web services, because they only detect previously known attacks and cannot detect new, unknown attacks. Web logs contain much valuable information that is useful for preventing intrusion. In this paper, we present a new web anomaly detection method that uses the FCERMining (Frequent Closed Episode Rules Mining) algorithm to analyze web logs and detect unknown web attacks. The novel FCERMining algorithm mines frequent closed episode rules in parallel on Spark, which handles massive data rapidly, and it prunes rules that are redundant for anomaly detection to improve matching efficiency. We also propose a grouping scheme to improve the parallel efficiency of the FCERMining algorithm. Finally, using SQLMAP and WebCruiser to simulate web attacks, our method achieves a detection rate of 96.67% and a false alarm rate of 3.33% for detecting abnormal users. Our experimental results also demonstrate that pruning redundant rules improves matching efficiency. Furthermore, we compare the efficiency of the FCERMining algorithm with other pattern mining algorithms; experimental results indicate that FCERMining outperforms them.
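Episode mining looks for ordered event patterns that recur within a time window of a log. A heavily simplified, serial sketch of the counting step for length-2 episodes (illustrative only; FCERMining mines *closed* episodes of arbitrary length, in parallel on Spark):

```python
from collections import Counter

def frequent_episodes(sequence, window, min_support):
    """Count ordered pairs (a, b) where b occurs within `window` events
    after a, counting each pair at most once per starting event, and
    keep those meeting min_support."""
    counts = Counter()
    for i, a in enumerate(sequence):
        seen = set()
        for b in sequence[i + 1 : i + 1 + window]:
            if (a, b) not in seen:
                counts[(a, b)] += 1
                seen.add((a, b))
    return {ep: c for ep, c in counts.items() if c >= min_support}
```

Rules derived from such frequent episodes model normal request sequences; log windows that match too few rules are flagged as anomalous.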