Since users are typically served by more than one server, remote user authentication schemes for multi-server architectures, rather than for the single-server case, are considered. In 2009, Hsiang and Shih proposed an "Improvement of the secure dynamic ID based remote user authentication scheme for multi-server environment," which uses a dynamic ID instead of a static ID to preserve the user's anonymity while verifying the legitimacy of a remote login user. In this paper, we analyze their protocol and demonstrate that it cannot achieve true anonymity and has several other weaknesses. We then propose improvements that avoid these security problems. Besides user privacy, the key features of our scheme include no verification table, freely chosen passwords, mutual authentication, low computation and communication cost, single registration, session key agreement, and security against the related attacks.
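The core idea of a dynamic-ID scheme is that the identifier transmitted at login changes on every session, so an eavesdropper cannot link logins to the same user. A minimal hash-based sketch of that idea (an illustration only, not the paper's actual protocol; the construction `h(h(ID, PW), nonce)` is an assumption):

```python
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    """One-way hash used throughout the sketch."""
    d = hashlib.sha256()
    for p in parts:
        d.update(p)
    return d.digest()

def dynamic_id(identity: bytes, password: bytes, nonce: bytes) -> bytes:
    # A fresh nonce makes the transmitted ID different on every login,
    # so two sessions of the same user are unlinkable to an observer.
    return h(h(identity, password), nonce)

nonce1, nonce2 = secrets.token_bytes(16), secrets.token_bytes(16)
cid1 = dynamic_id(b"alice", b"pw", nonce1)
cid2 = dynamic_id(b"alice", b"pw", nonce2)
```

A server that knows `h(ID, PW)` and receives the nonce can recompute and verify the ID, while the wire value itself reveals nothing static.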
"A Novel Approach to Dynamic ID-Based Remote User Authentication Scheme for Multi-server Environment". Min-Hua Shao, Y. Chin. 2010 Fourth International Conference on Network and System Security, September 2010. DOI: 10.1109/NSS.2010.95
This paper extends the dynamic copula model for bivariate option pricing of Goorbergh et al. (2004) to the pricing of credit spread options. We use GARCH-t models to describe the marginal distributions of corporate bonds and Treasuries, and combine them with a dynamic Gaussian copula to obtain the joint distribution. As an application, we use this model to price credit spread options written on American corporate bonds. Unlike other approaches to credit spread option pricing, this model is based on the two components of the spread rather than on the spread itself, and its dependence structure is time-varying.
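The copula step described above can be sketched in isolation: draw a correlated standard-normal pair and push each coordinate through the normal CDF, yielding uniforms whose dependence is governed by the correlation parameter. This sketch omits the GARCH-t marginals and uses a constant correlation, whereas the paper's copula is dynamic:

```python
import math
import random

def phi(x: float) -> float:
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gaussian_copula_sample(rho: float, rng: random.Random):
    # Correlated normal pair via the Cholesky factor of a 2x2
    # correlation matrix, then mapped to (0, 1) by the normal CDF.
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    return phi(z1), phi(z2)

rng = random.Random(7)
pairs = [gaussian_copula_sample(0.8, rng) for _ in range(5000)]
```

Feeding these uniforms through the inverse CDFs of the two fitted marginals would produce joint draws of the spread's two components.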
"Credit Spread Option Pricing by Dynamic Copulas". Ping Li, Guan-Ying Huang. 2010 Fourth International Conference on Network and System Security, September 2010. DOI: 10.1109/NSS.2010.93
The usage control model (UCON) is a foundation for next-generation access control models, with the distinguishing properties of decision continuity and attribute mutability. Constraints are among the most important components of UCON and are central to the motivations of usage analysis and design. The importance of the constraints associated with authorizations, obligations, and conditions in UCON has been recognized, but modeling these constraints has received little attention. In this paper we use a de facto constraint specification language from software engineering to analyze the constraints in the UCON model. We show how to represent constraints with the Object Constraint Language (OCL) and give a formalized specification of the UCON model built from basic constraints such as authorization predicates, obligation actions, and condition requirements. Further, we demonstrate the flexibility and expressive capability of the specified UCON model with extensive examples.
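To make the UCON ingredients concrete, here is a minimal sketch of an authorization predicate with attribute mutability, together with an OCL-flavoured invariant in a comment. The credit/price scenario and all names are hypothetical illustrations, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Subject:
    credit: int

@dataclass
class Resource:
    price: int

# OCL-style invariant this predicate encodes (illustrative only):
#   context Usage inv preAuthorization:
#     self.subject.credit >= self.resource.price
def pre_authorization(s: Subject, r: Resource) -> bool:
    """Authorization predicate over subject and object attributes."""
    return s.credit >= r.price

def consume(s: Subject, r: Resource) -> bool:
    # Attribute mutability: a permitted usage updates the subject's
    # credit, which later decisions re-evaluate (decision continuity).
    if not pre_authorization(s, r):
        return False
    s.credit -= r.price
    return True

alice = Subject(credit=10)
movie = Resource(price=4)
```

The point is that the decision depends on mutable attributes, so the same request can be permitted now and denied later.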
"Specifying Usage Control Model with Object Constraint Language". Min Li. 2010 Fourth International Conference on Network and System Security, September 2010. DOI: 10.4108/trans.sis.2013.01-03.e5
String matching is a fundamental string-processing problem, and privacy-preserving string matching, as a special case of secure multi-party computation, has broad applications in auctions, bidding, and other commercial areas. In this paper, several protocols are proposed to solve this private matching problem; their security and correctness are analyzed, and their practical efficiency is evaluated experimentally. A protocol based on the BMH algorithm is also designed, which is more efficient and conceals more private information.
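The BMH (Boyer-Moore-Horspool) algorithm that the final protocol builds on skips ahead using a bad-character shift table, so most text positions are never compared. A plain (non-private) implementation for reference; the secure-computation layer the paper adds on top is not shown:

```python
def bmh_search(text: str, pattern: str) -> int:
    """Boyer-Moore-Horspool search: index of the first match, or -1."""
    m, n = len(pattern), len(text)
    if m == 0:
        return 0
    if m > n:
        return -1
    # Shift table: for each character in the pattern (except the last
    # position), the distance from its last occurrence to the pattern end.
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
    i = 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            return i
        # Skip based on the text character aligned with the pattern's end.
        i += shift.get(text[i + m - 1], m)
    return -1
```

Characters absent from the pattern allow a full-length skip, which is what makes BMH fast in practice on typical inputs.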
"Privacy-Preserving Protocols for String Matching". Yonglong Luo, Lei Shi, Caiyun Zhang, Ji Zhang. 2010 Fourth International Conference on Network and System Security, September 2010. DOI: 10.1109/NSS.2010.24
In this paper, a new orientation-based method that operates in two steps, called the Dynamic Processing (DP) system, is proposed for determining the fingerprint reference point. For the two-step operation, it uses different orientation features at different scales: block-based orientation certainty and a pixel-based segmented direction map. The block-based orientation certainty, determined by the two eigenvalues of the gradient covariance matrix, describes the change in curvature of a fingerprint block, while the pixel-based segmented direction map is used to find the intersections that emerge from directional transitions. The DP system cross-references the two to determine the position of the reference point. While the proposed system achieves pixel-level precision by virtue of the pixel-based segmented direction map, the block-based orientation certainty eliminates many possible false detections. The proposed technique shows higher accuracy than previous techniques, and its performance is verified through simulations and analysis.
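A common way to derive block-based orientation certainty from the two eigenvalues of the gradient covariance matrix is the coherence measure (l1 - l2) / (l1 + l2), which is near 1 for a well-oriented ridge block and near 0 for isotropic, high-curvature regions such as the reference point. A sketch of that measure (one standard formulation; the paper's exact definition may differ):

```python
def orientation_certainty(block):
    """Coherence of the gradient field over a 2-D block of floats:
    ~1 for a strongly oriented pattern, ~0 for an isotropic one."""
    h, w = len(block), len(block[0])
    gxx = gyy = gxy = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Central-difference gradients.
            gx = (block[y][x + 1] - block[y][x - 1]) / 2.0
            gy = (block[y + 1][x] - block[y - 1][x]) / 2.0
            gxx += gx * gx
            gyy += gy * gy
            gxy += gx * gy
    denom = gxx + gyy
    if denom == 0:
        return 0.0
    # Equals (l1 - l2) / (l1 + l2) for the eigenvalues l1 >= l2 of the
    # gradient covariance matrix [[gxx, gxy], [gxy, gyy]].
    return (((gxx - gyy) ** 2 + 4.0 * gxy * gxy) ** 0.5) / denom

# Vertical stripes: all gradients horizontal, so certainty is maximal.
stripes = [[1.0 if (x % 4) < 2 else 0.0 for x in range(8)] for _ in range(8)]
flat = [[0.0] * 8 for _ in range(8)]
```

Scanning blocks for a sharp drop in this certainty is one way to localise the high-curvature region around the reference point.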
"Fingerprint Reference Point Determination Based on Orientation Features". S. Xie, Sook Yoon, Hui Gong, J. Shin, D. Park. 2010 Fourth International Conference on Network and System Security, September 2010. DOI: 10.1109/NSS.2010.75
Distributed Denial of Service (DDoS) attacks are one of the main threats to Internet security. Due to the spatio-temporal properties of such attacks, it is possible to detect them at an early stage. In this paper, we propose a novel method of DDoS threat assessment based on network vulnerability analysis. Our method considers both the multi-phase character of the attack in the temporal dimension and its impacts in the spatial dimension. We use three metrics to assess the threat: the ratio of progress, the botnet size, and the distribution of bots. Experimental results show that our method is sensitive to changes in the attack state and, owing to its simplicity, is easy to implement in an early warning system.
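One plausible way to fold the three metrics into a single score is a weighted sum of normalised values, using the entropy of the bot distribution to capture the spatial dimension. The weights, the normalisations, and the entropy choice here are all assumptions for illustration, not the paper's formula:

```python
import math

def threat_score(progress_ratio, botnet_size, bot_distribution,
                 max_botnet=10000, weights=(0.4, 0.3, 0.3)):
    """Illustrative composite DDoS threat score in [0, 1]."""
    # Normalised entropy of the bots' distribution over subnets:
    # widely dispersed bots are harder to filter, hence more threatening.
    total = sum(bot_distribution)
    if total == 0 or len(bot_distribution) < 2:
        dispersion = 0.0
    else:
        probs = [c / total for c in bot_distribution if c > 0]
        entropy = -sum(p * math.log(p) for p in probs)
        dispersion = entropy / math.log(len(bot_distribution))
    size = min(botnet_size / max_botnet, 1.0)  # clamp to [0, 1]
    w1, w2, w3 = weights
    return w1 * progress_ratio + w2 * size + w3 * dispersion
```

With this shape, a concentrated botnet scores lower than an evenly dispersed one of the same size and attack progress.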
"A Novel Threat Assessment Method for DDoS Early Warning Using Network Vulnerability Analysis". Qiang Liu, Jianping Yin, Zhiping Cai, M. Zhu. 2010 Fourth International Conference on Network and System Security, September 2010. DOI: 10.1109/NSS.2010.52
Providing access to relevant confidential information during an emergency increases the efficiency of emergency response operations. Existing approaches rely on a centralized on-line authority to regulate access to emergency applications and data. Unfortunately, it cannot be guaranteed that the central authority is available during an incident. Additionally, the central authority must be trusted to manage access to the restricted data of each participating organization. Using a physical security token for decentralized trust decisions to gain transient access to emergency applications and data mitigates these problems. Therefore, we propose the Attestation Verification Device (AVD), combined with a proper administration infrastructure and suitable protocols. Our solution distributes the administration responsibility in a way that empowers all participating emergency services and organizations to manage their own applications and confidential data.
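The essential property is that the trust decision can be made offline: a token holding material shared with the data-owning organization can verify an attestation of the requesting platform without contacting any central authority. A toy sketch of that decision using an HMAC (the keying scheme and function names are hypothetical; the actual AVD design uses attestation protocols not shown here):

```python
import hashlib
import hmac

def issue_attestation(org_key: bytes, platform_state: bytes) -> bytes:
    """Attestation of a platform state, keyed by the owning organization."""
    return hmac.new(org_key, platform_state, hashlib.sha256).digest()

def avd_grant_access(org_key: bytes, platform_state: bytes,
                     attestation: bytes) -> bool:
    # Verified locally on the device; no on-line authority is consulted.
    expected = hmac.new(org_key, platform_state, hashlib.sha256).digest()
    return hmac.compare_digest(expected, attestation)
```

Each organization provisions its own key material, so it retains control over access to its own data even while the coalition operates jointly.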
"Securing Emergency Response Operations Using Distributed Trust Decisions". Peter Danner, Daniel M. Hein, Stefan Kraxberger. 2010 Fourth International Conference on Network and System Security, September 2010. DOI: 10.1109/NSS.2010.34
Stepping-stone attacks are often used by network intruders to hide their identities. The round-trip times (RTTs) between send packets and the corresponding echo packets along the connection chains of stepping stones are critical for detecting such attacks. In this paper, we propose a novel real-time RTT measurement algorithm for stepping stones, based on an estimate of the current RTT value. Our experiments show that it is far more precise than previous real-time RTT measurement algorithms. We also present a probabilistic analysis showing that our algorithm achieves both a high matching rate and a high accuracy rate.
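The idea of using a running RTT estimate to pair send packets with their echoes can be sketched with an exponentially weighted moving average (the smoothing used by TCP's SRTT; an assumption here, since the paper's estimator may differ) and a tolerance window around the estimate:

```python
def ewma_rtt_estimator(samples, alpha=0.125):
    """Running RTT estimates from a stream of measured samples."""
    est = None
    history = []
    for s in samples:
        est = s if est is None else (1 - alpha) * est + alpha * s
        history.append(est)
    return history

def match_echo(send_time, echo_times, rtt_estimate, tolerance=0.2):
    """Pick the echo whose delay is closest to the current RTT estimate;
    reject the match if it deviates by more than the tolerance fraction."""
    best = min(echo_times, key=lambda t: abs((t - send_time) - rtt_estimate))
    delay = best - send_time
    if abs(delay - rtt_estimate) <= tolerance * rtt_estimate:
        return best
    return None
```

Each accepted match yields a new RTT sample, which in turn refines the estimate used for the next packet, which is what makes the scheme real-time.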
"Getting the Real-Time Precise Round-Trip Time for Stepping Stone Detection". Ping Li, Wanlei Zhou, Yini Wang. 2010 Fourth International Conference on Network and System Security, September 2010. DOI: 10.1109/NSS.2010.36
Feature selection is an important research problem in machine learning and data mining applications. This paper proposes a hybrid wrapper-filter feature selection algorithm that introduces the filter's feature ranking score in the wrapper stage to speed up the wrapper's search and thereby find a more compact feature subset. The approach hybridizes a Mutual Information (MI) based Maximum Relevance (MR) filter ranking heuristic with an Artificial Neural Network (ANN) based wrapper in which Artificial Neural Network Input Gain Measurement Approximation (ANNIGMA) is combined with MR (MR-ANNIGMA) to guide the wrapper's search. The novelty of our approach is the hybrid of wrapper and filter methods, which combines the filter's ranking score with the wrapper heuristic's score to take advantage of both. The performance of the proposed MR-ANNIGMA has been verified on benchmark data sets and compared with purely filter-based and purely wrapper-based approaches. Experimental results show that MR-ANNIGMA achieves more compact feature sets and higher accuracies than either approach alone.
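The combination step can be sketched as follows: compute an MI-based relevance score per feature, normalise both it and the wrapper's heuristic score, and blend them. The 50/50 weighting and the max-normalisation are illustrative assumptions (the paper's ANNIGMA scores come from a trained ANN, which is stubbed out here):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """MI between two discrete sequences, in nats (MR filter score)."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def hybrid_score(mi_scores, wrapper_scores, w=0.5):
    """Blend the filter's MI ranking with the wrapper heuristic's score
    (a stand-in for ANNIGMA), each max-normalised to [0, 1]."""
    def norm(v):
        hi = max(v) or 1.0
        return [x / hi for x in v]
    return [w * a + (1 - w) * b
            for a, b in zip(norm(mi_scores), norm(wrapper_scores))]
```

Ranking features by this blended score lets the wrapper's backward search drop low-scoring features early, which is the speed-up the paper targets.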
"Hybrid Wrapper-Filter Approaches for Input Feature Selection Using Maximum Relevance and Artificial Neural Network Input Gain Measurement Approximation (ANNIGMA)". Md. Shamsul Huda, J. Yearwood, A. Stranieri. 2010 Fourth International Conference on Network and System Security, September 2010. DOI: 10.1109/NSS.2010.7
As network capacity has increased over the past decade, individuals and organisations have found it increasingly appealing to make use of remote services in the form of service-oriented architectures and cloud computing services. Data processed by remote services, however, is no longer under the direct control of the individual or organisation that provided it, leaving data owners at risk of data theft or misuse. This paper describes a model by which data owners can control the distribution and use of their data throughout a dynamic coalition of service providers using digital rights management technology. Our model allows a data owner to establish the trustworthiness of every member of a coalition employed to process the data, and to communicate a machine-enforceable usage policy to every such member.
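A machine-enforceable usage policy, at its simplest, is a data structure that every coalition member evaluates before acting on the data. The ad-hoc structure below is purely illustrative; the paper builds on DRM rights-expression technology rather than this toy:

```python
from dataclasses import dataclass

@dataclass
class UsagePolicy:
    """Policy a data owner attaches to a data set (hypothetical shape)."""
    allowed_actions: set
    trusted_members: set
    max_uses: int
    uses: int = 0

    def permit(self, member: str, action: str) -> bool:
        # A member must be trusted, the action licensed, and the
        # usage budget not yet exhausted; permitted uses are counted.
        if member not in self.trusted_members:
            return False
        if action not in self.allowed_actions:
            return False
        if self.uses >= self.max_uses:
            return False
        self.uses += 1
        return True

policy = UsagePolicy(allowed_actions={"read"},
                     trusted_members={"svc-a"}, max_uses=2)
```

In the paper's model, enforcement of such a policy is delegated to trusted components on each provider, so the owner's rules travel with the data.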
"A Rights Management Approach to Securing Data Distribution in Coalitions". Farzad Salim, N. Sheppard, R. Safavi-Naini. 2010 Fourth International Conference on Network and System Security, September 2010. DOI: 10.1109/NSS.2010.94