Tamper evident tape integrity analyzer
Pub Date: 2016-10-01 | DOI: 10.1109/CCST.2016.7815715 | Pages: 1-6
E. Rao, J. Remer
The Cargo Supply Chain Integrity Technology (CSIT) Research, Development, Test and Evaluation project, jointly managed by the Transportation Security Administration (TSA) and the Department of Homeland Security's (DHS) Science and Technology (S&T) Directorate, develops standards and certifies systems to mitigate threats posed by the potential introduction of improvised explosive devices into cargo carried by passenger aircraft. At the direction of Congress, Public Law 110-53, “Implementing the Recommendations of the 9/11 Commission Act of 2007”, mandates that 100% of cargo shipped via passenger aircraft be screened.
{"title":"Tamper evident tape integrity analyzer","authors":"E. Rao, J. Remer","doi":"10.1109/CCST.2016.7815715","DOIUrl":"https://doi.org/10.1109/CCST.2016.7815715","url":null,"abstract":"The Cargo Supply Chain Integrity Technology (CSIT) Research, Development, Test and Evaluation project, jointly managed by the Transportation Security Administration (TSA) and the Department of Homeland Security's (DHS) Science and Technology (S&T) Directorate develops standards and certifies systems to mitigate threats posed by the potential introduction of improvised explosive devices into cargo carried by passenger aircraft. At the direction of Congress, Public Law 110-53, “Implementing the Recommendations of the 9/11 Commission Act of 2007”, mandates that all 100% of all cargo shipped via passenger aircraft be screened.","PeriodicalId":6510,"journal":{"name":"2016 IEEE International Carnahan Conference on Security Technology (ICCST)","volume":"11 1","pages":"1-6"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87000356","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Underwater threat detection and tracking using multiple sensors and advanced processing
Pub Date: 2016-10-01 | DOI: 10.1109/CCST.2016.7815723 | Pages: 1-7
A. Meecham, T. Acker
The vulnerability of military installations and critical infrastructure sites to underwater threats is now well accepted and, to combat these security weaknesses, there has been growing interest in - and adoption of - sonar technology. The increasing availability of Autonomous/Unmanned Underwater Vehicles (A/UUVs) to both adversary nations and terrorists/saboteurs is a further cause for concern. The small size and low acoustic target strength/signature of these vehicles present significant challenges for sonar systems. The well-known difficulties of the underwater environment, particularly in a harbor or port setting, can lead to a Nuisance Alarm Rate (NAR) that is higher than that of traditional security sensors (e.g. CCTV). This, in turn, can lead to a lack of confidence among end users and the possibility that `real' alerts are incorrectly dismissed. In the past this has been addressed by increasing the capability of individual sensors, leading to ever-increasing sensor complexity; however, the relationship between sensor performance and complexity/cost is highly non-linear, and even with the most complex and capable sensors, performance is often fundamentally limited by acoustics rather than sensor capability. In this paper we describe an alternative approach to reducing NAR and improving detection of difficult targets (e.g. UUVs), through intelligent combination and fusion of outputs from multiple sensors and data/signal processing algorithms. We describe the statistical basis for this approach, as well as techniques, methodologies and architectures for implementation. We describe the approach taken in our prototype algorithms/system, along with quantitative and qualitative results from testing in a real-world environment. These results show a significant reduction in NAR and an increase in classification/alert range. Finally, we describe current focus areas for algorithmic and system development in both the short and medium term, as well as future extensions of these techniques to more classes of sensors, so that more challenging problems can be addressed.
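For illustration of the statistical basis such fusion typically rests on, the sketch below (not the authors' system; the sensor probabilities and threshold are assumed values) sums per-sensor log-likelihood ratios under an independence assumption, so a single noisy sensor firing alone cannot raise an alert:

```python
# Minimal sketch of likelihood-ratio fusion across independent sensors: each
# sensor reports a detect/no-detect flag; under independence the fused
# log-likelihood ratio is the sum of per-sensor terms. The Pd/Pfa values are
# illustrative assumptions, not measured figures.
from math import log

SENSORS = {            # sensor_id: (P(detect | target), P(detect | no target))
    "sonar_a": (0.90, 0.10),
    "sonar_b": (0.85, 0.15),
    "magnetic": (0.60, 0.05),
}

def fused_llr(detections: dict) -> float:
    """Sum per-sensor log-likelihood ratios for the observed flags."""
    llr = 0.0
    for sid, detected in detections.items():
        pd, pfa = SENSORS[sid]
        if detected:
            llr += log(pd / pfa)
        else:
            llr += log((1 - pd) / (1 - pfa))
    return llr

# Declare a target only when the fused evidence clears a threshold chosen for
# the desired nuisance-alarm rate.
THRESHOLD = 2.0
obs = {"sonar_a": True, "sonar_b": False, "magnetic": True}
print(fused_llr(obs), fused_llr(obs) > THRESHOLD)
```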
{"title":"Underwater threat detection and tracking using multiple sensors and advanced processing","authors":"A. Meecham, T. Acker","doi":"10.1109/CCST.2016.7815723","DOIUrl":"https://doi.org/10.1109/CCST.2016.7815723","url":null,"abstract":"The vulnerability of military installations and critical infrastructure sites from underwater threats is now well accepted and, in order to combat these security weaknesses, there has been growing interest in - and adoption of - sonar technology. Greater availability of Autonomous/Unmanned Underwater Vehicles (A/UUVs) to both adversary nations and terrorists/saboteurs is also a cause of increasing concern. The small size and low acoustic target strength/signature of these vehicles presents significant challenges for sonar systems. The well-known challenges of the underwater environment, particularly in a harbor or port setting, can lead to a Nuisance Alarm Rate (NAR) that is higher than that of traditional security sensors (e.g. CCTV). This, in turn, can lead to a lack of confidence from end users and a possibility that `real' alerts are incorrectly dism issed. In the past this has been addressed by increasing the capability of individual sensors, leading to ever-increasing sensor complexity, however, the relationship between sensor performance and complexity/cost is highly non-linear. Even with the most complex and capable sensors, the fundamental limit to performance is often limited by acoustics, not sensor capability. In this paper we describe an alternative approach to reducing NAR and improving detection of difficult targets (e.g. UUVs), through intelligent combination and fusion of outputs from multiple sensors and data/signal processing algorithms. We describe the statistical basis for this approach, as well as techniques, methodologies and architectures for implementation. We describe the approach taken in our prototype algorithms/system, as well as quantitative and qualitative results from testing in a real-world environment. These results show a significant reduction in NAR and increase in classiflcation/alert range. Finally, we describe current focus areas for algorithmic and system development in both the short and medium term, as well as future extensions of these techniques to more classes of sensors, so that more challenging problems can be addressed.","PeriodicalId":6510,"journal":{"name":"2016 IEEE International Carnahan Conference on Security Technology (ICCST)","volume":"C-35 1","pages":"1-7"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84452086","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
RSA based biometric encryption system using FPGA for increased security
Pub Date: 2016-10-01 | DOI: 10.1109/CCST.2016.7815699 | Pages: 1-4
Michael Bourg, Pramod Govindan
A biometric encryption system is a significant addition in the areas of privacy, security and convenience for its users. This research proposes an RSA-based biometric encryption system that can be realized on field-programmable gate arrays (FPGAs) using hardware-software co-design methods. Because many attackers stand to profit from sub-par security methods, the proposed design aims to provide a high level of security. This implementation can be applied in many areas, including but not limited to password replacement, building and equipment access, and payroll and timekeeping procedures.
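As a toy illustration of the RSA layer only (the biometric feature extraction and FPGA co-design are out of scope here; the primes, exponent, and template string below are hypothetical), one might encrypt an integer digest of a biometric template under a textbook-RSA public key:

```python
# Toy RSA sketch: a biometric template, reduced to an integer digest, is
# encrypted under a textbook-RSA public key. Parameters are illustrative;
# real deployments need padded RSA and >= 2048-bit moduli.
import hashlib

p, q = 61, 53                # demo primes only
n, phi = p * q, (p - 1) * (q - 1)
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

def digest(template: bytes) -> int:
    """Map a biometric template to an integer smaller than the modulus."""
    return int.from_bytes(hashlib.sha256(template).digest(), "big") % n

template = b"minutiae: (12,34) (56,78) (90,11)"   # hypothetical features
m = digest(template)
c = pow(m, e, n)             # encrypt at enrollment
assert pow(c, d, n) == m     # decrypt at verification
```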
{"title":"RSA based biometric encryption system using FPGA for increased security","authors":"Michael Bourg, Pramod Govindan","doi":"10.1109/CCST.2016.7815699","DOIUrl":"https://doi.org/10.1109/CCST.2016.7815699","url":null,"abstract":"The biometric encryption system is a significant addition in the areas of privacy, security and convenience among its users. The intent of this research is to propose an RSA based biometric encryption system which can be realized on field programmable gate arrays (FPGAs) using hardware-software co-design methods. Due to the high number of hackers that stand to profit from sub-par security methods, the proposed design will serve as a high level of security. This implementation can be applied in many areas of life including but not limited to password replacement, building and equipment access, and payroll and timekeeping procedures.","PeriodicalId":6510,"journal":{"name":"2016 IEEE International Carnahan Conference on Security Technology (ICCST)","volume":"20 1","pages":"1-4"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84905886","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
On the application of symbolic regression and genetic programming for cryptanalysis of symmetric encryption algorithm
Pub Date: 2016-10-01 | DOI: 10.1109/CCST.2016.7815720 | Pages: 1-8
Tomas Smetka, I. Homoliak, P. Hanáček
The aim of this paper is to offer a different point of view on the cryptanalysis of symmetric encryption algorithms. Our approach, unlike existing methods, harnesses evolutionary principles: our cryptanalytic system applies genetic programming (GP) to perform a known-plaintext attack (KPA). The goal is to find a program (i.e. a function) that models the behavior of the symmetric encryption algorithm DES instantiated with a specific key. If such a program existed, it would be possible to decipher new messages encrypted with the unknown secret key. GP forms the basis of this work: it is an evolutionary methodology inspired by biological evolution, capable of creating computer programs that solve a given problem. Symbolic regression (SR) is employed as the practical application of GP. The SR method builds functions from a predefined set of terminal blocks during GP evolution, and these functions approximate a list of input value pairs. The evolution is driven by a fitness function that evaluates progress toward the goal; for our cryptanalysis problem we choose the Hamming distance between a current individual's output and a reference value. The results of our experiments did not confirm our initial expectations. The number of encryption rounds did not influence the quality of the best individual; its quality was, however, influenced by the cardinality of the training set. Eliminating the initial and final permutations had no influence on the quality of the results during evolution. These results show that our KPA GP solution is not capable of revealing the internal structure of the DES algorithm's behavior.
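A minimal sketch of the fitness evaluation described above (the GP engine itself is not reproduced; the candidate and the training pair below are illustrative) scores an individual by the summed Hamming distance between its predicted ciphertexts and the true ones:

```python
# Sketch of the Hamming-distance fitness only: a candidate individual is a
# function from a 64-bit plaintext block to a predicted ciphertext block, and
# its fitness is the summed Hamming distance to the true ciphertexts over a
# known-plaintext training set. Lower is better; 0 reproduces every ciphertext.
def hamming(a: int, b: int, bits: int = 64) -> int:
    """Number of differing bits between two `bits`-wide blocks."""
    return bin((a ^ b) & ((1 << bits) - 1)).count("1")

def fitness(candidate, training_set) -> int:
    return sum(hamming(candidate(pt), ct) for pt, ct in training_set)

# Hypothetical known-plaintext pair (plaintext, ciphertext) under one DES key.
training_set = [(0x0123456789ABCDEF, 0x85E813540F0AB405)]
identity = lambda pt: pt          # a trivially bad individual
print(fitness(identity, training_set))
```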
{"title":"On the application of symbolic regression and genetic programming for cryptanalysis of symmetric encryption algorithm","authors":"Tomas Smetka, I. Homoliak, P. Hanáček","doi":"10.1109/CCST.2016.7815720","DOIUrl":"https://doi.org/10.1109/CCST.2016.7815720","url":null,"abstract":"The aim of the paper is to show different point of view on the problem of cryptanalysis of symmetric encryption algorithms. Our dissimilar approach, compared to the existing methods, lies in the use of the power of evolutionary principles which are in our cryptanalytic system applied with leveraging of the genetic programming (GP) in order to perform known plaintext attack (KPA). Our expected result is to find a program (i.e. function) that models the behavior of a symmetric encryption algorithm DES instantiated by specific key. If such a program would exist, then it could be possible to decipher new messages that have been encrypted by unknown secret key. The GP is employed as the basis of this work. GP is an evolutionary algorithm-based methodology inspired by biological evolution which is capable of creating computer programs solving a corresponding problem. The symbolic regression (SR) method is employed as the application of GP in practical problem. The SR method builds functions from predefined set of terminal blocks in the process of the GP evolution; and these functions approximate a list of input value pairs. The evolution of GP is controlled by a fitness function which evaluates the goal of a corresponding problem. The Hamming distance, a difference between a current individual value and a reference one, is chosen as the fitness function for our cryptanalysis problem. The results of our experiments did not confirmed initial expectation. The number of encryption rounds did not influence the quality of the best individual, however, its quality was influenced by the cardinality of a training set. The elimination of the initial and final permutations had no influence on the quality of the results in the process of evolution. These results showed that our KPA GP solution is not capable of revealing internal structure of the DES algorithm's behavior.","PeriodicalId":6510,"journal":{"name":"2016 IEEE International Carnahan Conference on Security Technology (ICCST)","volume":"69 1","pages":"1-8"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81804012","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Extending abstraction-refinement methods for compliance checking of inter-organizational business processes with incomplete information
Pub Date: 2016-10-01 | DOI: 10.1109/CCST.2016.7815703 | Pages: 1-7
A. C. D'Iddio, C. Schunck, F. Arcieri, M. Talamo
Conformance checking is a crucial challenge for modern inter-organizational business processes when critical security, privacy and workflow constraints must be satisfied to ensure the reliability of multi-party business procedures. Many of these constraints can be expressed in terms of causal dependencies, and verifying such dependencies can be fundamental to determining the correctness of transactions. But the information required to check causal dependencies is often incomplete, coarse or imprecise for several reasons, such as low maturity of event logs, corrupted data, local timestamping and the privacy requirements of each organization. In previous work we presented a solution to these issues based on abstraction, over-approximation and under-approximation of the causal dependencies, to model unavailable data while maintaining the ability to prove correctness or to find anomalies in inter-organizational transactions. That work made assumptions about the structure of business processes which are reasonable for security-sensitive processes but cannot be applied in all circumstances. In this paper we relax those assumptions and discuss how doing so affects the applicability of the theorems. We find that while some notions need to be redefined, in most cases the same techniques, especially those based on under-approximation, remain applicable for investigating the correctness of business processes and finding anomalies for post-mortem investigation or online operational support.
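As a minimal sketch of the under-approximation idea (the encoding is assumed, not the paper's formalism), a causal dependency can be flagged only when coarse interval timestamps make a violation certain, so every reported anomaly is a true one:

```python
# Under-approximated check with coarse timestamps: each event carries an
# interval [lo, hi] of possible occurrence times; the dependency "a must
# precede b" is reported as violated only when b certainly ended before a
# could have started.
from typing import NamedTuple

class Event(NamedTuple):
    name: str
    lo: float   # earliest possible occurrence time
    hi: float   # latest possible occurrence time

def certainly_violated(a: Event, b: Event) -> bool:
    """a -> b is certainly violated iff b ended before a could start."""
    return b.hi < a.lo

order = Event("order_signed", lo=10.0, hi=12.0)
ship = Event("goods_shipped", lo=3.0, hi=5.0)   # finished before signing began
print(certainly_violated(order, ship))           # True: a definite anomaly
```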
{"title":"Extending abstraction-refinement methods for compliance checking of inter-organizational business processes with incomplete information","authors":"A. C. D'Iddio, C. Schunck, F. Arcieri, M. Talamo","doi":"10.1109/CCST.2016.7815703","DOIUrl":"https://doi.org/10.1109/CCST.2016.7815703","url":null,"abstract":"Conformance checking is a crucial challenge for modern inter-organizational business processes when critical security, privacy and workflow constraints must be satisfied to ensure the reliability of multi-party business procedures. Many of these constraints can be expressed in terms of causal dependencies, and verifying such dependencies can be fundamental to determine the correctness of transactions. But often the information required to check causal dependencies is incomplete, coarse or imprecise due to several reasons, like low maturity of event logs, corrupted data, local timestamping and privacy requirements of each organization. In previous work we presented a solution to address these issues based on abstraction, over-approximation and under-approximation of the causal dependencies, to model unavailable data and maintain the ability to prove correctness or to find anomalies in inter-organizational transactions. In that paper we made some assumptions about the structure of business processes which are reasonable for security sensitive business processes but cannot be applied in all circumstances. In this paper we relax the assumptions made in that previous work and we discuss how this affects the applicability of the theorems. We find that while some notions need to be redefined, in most cases the same techniques, especially the ones based on underapproximation, remain applicable to investigate the correctness of business processes and to find anomalies for post-mortem investigation or online operational support.","PeriodicalId":6510,"journal":{"name":"2016 IEEE International Carnahan Conference on Security Technology (ICCST)","volume":"11 1","pages":"1-7"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83802333","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Internet of Everything based integrated security system of the World War One Commemorative Museum of Fogliano Redipuglia in Italy
Pub Date: 2016-10-01 | DOI: 10.1109/CCST.2016.7815683 | Pages: 1-8
F. Garzia, L. Sant'Andrea
The purpose of this paper is to illustrate the Internet of Everything based integrated security system designed for the World War One commemorative museum of Fogliano Redipuglia in Italy, capable of ensuring visitors' security, preservation/protection of the cultural heritage, and great usability for visitors, with particular reference to visitors with disabilities. Genetic Algorithms (GAs) have been used to design the integrated security system, in particular the wired network, to reduce final costs and ensure a high level of reliability and resilience of the system itself, taking into consideration the constraints and restrictions typical of existing museums. Thanks to its flexibility, the proposed system, together with the GA-based optimization technique, can be used in any kind of museum or cultural site with appropriate adaptation.
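A toy GA sketch of the wired-network design step (the links, costs, and GA parameters below are hypothetical, not the museum's actual topology) encodes each candidate cabling as a bit-vector over candidate links and penalizes layouts that leave a device unreachable:

```python
# Toy GA for wired-network cost minimization: a chromosome selects candidate
# cable links; fitness charges cable cost and heavily penalizes layouts where
# some device is unreachable from the control room (node 0).
import random

LINKS = [(0, 1, 40), (0, 2, 55), (1, 2, 20), (1, 3, 35), (2, 3, 30), (2, 4, 50), (3, 4, 25)]
N_NODES = 5

def connected(genome):
    """DFS over the selected links; True if every node is reachable from 0."""
    adj = {i: [] for i in range(N_NODES)}
    for bit, (u, v, _) in zip(genome, LINKS):
        if bit:
            adj[u].append(v); adj[v].append(u)
    seen, stack = {0}, [0]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w); stack.append(w)
    return len(seen) == N_NODES

def fitness(genome):
    cost = sum(c for bit, (_, _, c) in zip(genome, LINKS) if bit)
    return cost + (0 if connected(genome) else 10_000)   # disconnection penalty

def evolve(pop_size=30, gens=100, p_mut=0.1):
    pop = [[random.randint(0, 1) for _ in LINKS] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]                     # truncation selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, len(LINKS))        # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([g ^ (random.random() < p_mut) for g in child])
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```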
{"title":"The Internet of Everything based integrated security system of the World War One Commemorative Museum of Fogliano Redipuglia in Italy","authors":"F. Garzia, L. Sant'Andrea","doi":"10.1109/CCST.2016.7815683","DOIUrl":"https://doi.org/10.1109/CCST.2016.7815683","url":null,"abstract":"The purpose of this paper is to illustrate the Internet of Everything based integrated security system designed for World War One commemorative Museum of Fogliano Redipuglia in Italy, capable of ensuring visitors security, cultural heritage préservâtion/protection and great usability for visitors, with particular reference to visitors with disabilities. Genetic Algorithms (GAs) have been used to design the integrated security system, in particular for wired network to ensure a reduction of final costs and a high level of reliability and resilience of the system itself, keeping, into consideration, the typical vincula and restrictions of already existing Museums. The proposed system, together with the GAs based optimization technique, thanks to its flexibility, can be used in any kind of museum or any kind of cultural site by means of a proper adaption.","PeriodicalId":6510,"journal":{"name":"2016 IEEE International Carnahan Conference on Security Technology (ICCST)","volume":"7 1","pages":"1-8"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82500203","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Base camp quality of life standardization and improvement
Pub Date: 2016-10-01 | DOI: 10.1109/ccst.2016.7815688 | Pages: 1-8
Patrick K. Kuiper, S. Kolitz, V. Tarokh
The United States (US) Army has over 66,000 soldiers engaged in contingency operations across the world. Current budgetary constraints and an uncertain global security environment require these operations to be executed as efficiently as possible. Base camps are the secured areas where soldiers live when deployed to contingency operations. They impose a significant financial and tactical burden during contingency operations, and sub-optimal soldier quality of life decisions have contributed significantly to costs. Quality of life (QOL) refers to the non-security and non-mission related services that directly sustain the mission effectiveness of soldiers. Current US Army base camp tactics, techniques, and procedures (TTPs) do not sufficiently specify QOL services, and more detailed doctrine should be developed to support combat units executing contingency operations. In this investigation we employ quantitative methods to select decisions that improve QOL and inform doctrine. We leverage a QOL function and resource consumption data developed by the US Army Natick Soldier Research, Development and Engineering Center (Natick Labs) to build a model that improves QOL under the constraints of four fundamental resources: fuel, water, waste water, and solid waste. We employ a mixed-integer linear programming approach and perform sensitivity analysis to evaluate the strength of our results. Our final model is formulated as a chance-constrained optimization to address the uncertainty associated with resource availability in contingency operations. Our results provide QOL decisions that reduce resource consumption while maintaining a QOL level equivalent to current TTPs. The model provides quantitative rigor, informing decision makers of specific base camp design principles for the development of doctrine.
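As an illustration of the scenario approximation commonly used for such chance constraints (the service scores, fuel figures, and scenarios below are invented, not Natick Labs' data), this PuLP sketch lets the fuel budget be exceeded in at most one of ten sampled availability scenarios (roughly 90% satisfaction):

```python
# Scenario approximation of a chance-constrained QOL selection: choose which
# QOL services to run to maximize total QOL score, allowing the fuel
# constraint to be violated (z[i] = 1) in at most MAX_VIOLATIONS scenarios.
from pulp import LpProblem, LpVariable, LpMaximize, LpBinary, lpSum, value

SERVICES = {  # name: (QOL score, fuel use in gal/day) - hypothetical numbers
    "laundry": (3, 40), "hot_meals": (8, 90), "gym": (4, 25),
    "showers": (7, 60), "mwr_tent": (5, 30),
}
FUEL_SCENARIOS = [150, 160, 170, 155, 165, 150, 175, 160, 158, 162]
MAX_VIOLATIONS, BIG_M = 1, 1_000

prob = LpProblem("base_camp_qol", LpMaximize)
x = {s: LpVariable(f"x_{s}", cat=LpBinary) for s in SERVICES}       # run it?
z = [LpVariable(f"z_{i}", cat=LpBinary) for i in range(len(FUEL_SCENARIOS))]

prob += lpSum(SERVICES[s][0] * x[s] for s in SERVICES)              # total QOL
for i, fuel in enumerate(FUEL_SCENARIOS):
    # Fuel use must fit scenario i's supply unless z[i] marks it violated.
    prob += lpSum(SERVICES[s][1] * x[s] for s in SERVICES) <= fuel + BIG_M * z[i]
prob += lpSum(z) <= MAX_VIOLATIONS                                  # chance level

prob.solve()
print([s for s in SERVICES if value(x[s]) > 0.5], value(prob.objective))
```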
On the evaluation of human ratings for signature recognition
Pub Date: 2016-10-01 | DOI: 10.1109/CCST.2016.7815681 | Pages: 1-5
D. Morocho, J. Hernandez-Ortega, A. Morales, Julian Fierrez, J. Ortega-Garcia
This work explores the human ability to recognize the authenticity of signatures. We use crowdsourcing to analyze the different factors affecting the performance of laymen, i.e. people without Forensic Document Examiner experience. We present experiments covering different scenarios in which laymen provide similarity measures related to the perceived authenticity of a given signature. The human responses are used to analyze performance according to each scenario and main factor. The experiments comprise 240 signatures from the BiosecurID public database and responses from more than 400 people. The results show the difficulty of these tasks, in particular the false acceptance of forgeries, with rates ranging from 50% to 75%. The results suggest that human recognition abilities in this scenario depend strongly on the characteristics considered and the signature at hand. Finally, the combination of human ratings clearly outperforms both individual performance and a state-of-the-art automatic signature verification system.
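A minimal sketch of the rating-combination step (the ratings below are synthetic, not the BiosecurID responses) pools several laymen's similarity scores by averaging and thresholds the mean:

```python
# Pooling crowdsourced similarity ratings: average the per-signature ratings
# and accept as genuine when the pooled score clears a threshold. The
# combination smooths out individual raters' mistakes.
from statistics import mean

ratings = {   # signature id: ratings on a 1-5 similarity scale
    "sig_genuine_01": [4, 5, 4, 3, 5],
    "sig_forgery_07": [4, 2, 3, 2, 1],   # a skilled forgery fools some raters
}
THRESHOLD = 3.5   # illustrative operating point

for sig, rs in ratings.items():
    pooled = mean(rs)
    print(sig, pooled, "accepted" if pooled >= THRESHOLD else "rejected")
```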
{"title":"On the evaluation of human ratings for signature recognition","authors":"D. Morocho, J. Hernandez-Ortega, A. Morales, Julian Fierrez, J. Ortega-Garcia","doi":"10.1109/CCST.2016.7815681","DOIUrl":"https://doi.org/10.1109/CCST.2016.7815681","url":null,"abstract":"This work explores the human ability to recognize the authenticity of signatures. We use crowdsourcing to analyze the different factors affecting the performance of humans without Forensic Document Examiner experience. We present different experiments according to different scenarios in which laymen, people without Forensic Document Examiner experience, provide similarity measures related with the perceived authenticity of a given signature. The human responses are used to analyze the performance of humans according to each of the scenarios and main factors. The experiments comprise 240 signatures from BiosecurlD public database and responses from more than 400 people. The results shows the difficulties associated to these tasks, with special attention to the false acceptance of forgeries with rates ranging from 50% to 75%. The results suggest that human recognition abilities in this scenario are strongly dependent on the characteristics considered and the signature at hand. Finally the combination of human ratings clearly outperfoms the individual performance and and a state-of-the-art automatic signature verification system.","PeriodicalId":6510,"journal":{"name":"2016 IEEE International Carnahan Conference on Security Technology (ICCST)","volume":"1 1","pages":"1-5"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89762001","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Digital flight plans for server access control: Restricting anomalous activity with path-based declarations of intentions
Pub Date: 2016-10-01 | DOI: 10.1109/CCST.2016.7815705 | Pages: 1-6
R. Loui, Lucinda Caughey
In response to increasing threats of malicious activity and data loss on servers, we propose a different and practical strategy for access control modeled after flight plans for pilots, which mixes existing role-based, object-based, and intention-based access models; it supports much finer-grained, real-time, sequence-oriented anomaly detection. Users are required to declare their intended “flight path” in advance: a sketch of resource use that may vary in detail but could include database tables, file system directories, byte and bandwidth limits, use of encryption and archive creation, command sets, connection time, number and origin of connections, and ports. Sequence information provides an especially strong constraint, even when it is incomplete. We find an important place for active, on-line human sampling of flight plans, as well as pre-authorization for non-standard paths and alerts for deviation from a path. We also find a place for improved user profiling and a paradigm shift from ex-post log-based reconstruction of user activity to ex-ante declaration.
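A sketch of the core runtime check (the data model is assumed; the paper fixes no concrete schema) matches each access in a session against the declared flight path and yields alerts on deviation:

```python
# A session declares its "flight path" up front - the resources it intends to
# touch, a coarse intended order, and a byte limit - and each runtime access
# is matched against the declaration.
DECLARED_PLAN = {
    "resources": {"/var/app/uploads", "db.orders", "db.customers"},
    "sequence": ["db.customers", "db.orders"],   # coarse intended order
    "max_bytes": 50_000_000,
}

def check_session(events):
    """Yield alerts for accesses outside the declared plan or out of order."""
    seq, sent = iter(DECLARED_PLAN["sequence"]), 0
    expected = next(seq, None)
    for resource, nbytes in events:
        sent += nbytes
        if resource not in DECLARED_PLAN["resources"]:
            yield f"undeclared resource: {resource}"
        elif resource == expected:
            expected = next(seq, None)            # advance along the path
        elif resource in DECLARED_PLAN["sequence"]:
            yield f"out-of-sequence access: {resource}"
        if sent > DECLARED_PLAN["max_bytes"]:
            yield "byte limit exceeded"

session = [("db.customers", 10_000), ("db.orders", 5_000), ("/etc/passwd", 100)]
print(list(check_session(session)))   # ['undeclared resource: /etc/passwd']
```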
{"title":"Digital flight plans for server access control: Restricting anomalous activity with path-based declarations of intentions","authors":"R. Loui, Lucinda Caughey","doi":"10.1109/CCST.2016.7815705","DOIUrl":"https://doi.org/10.1109/CCST.2016.7815705","url":null,"abstract":"In response to increasing threats of malicious activity and data loss on servers, we propose a different and practical strategy for access control modeled after flight plans for pilots, which mixes existing role-based, object-based, and intention-based access models; it supports much finer grained, real-time, sequence-oriented anomaly detection. Users are required to declare their intended “flight path” in advance, a sketch of resource use: this may vary in detail, but could include database tables, file system directories, byte and bandwidth limits, use of encryption and archive creation, command sets, connection time, number and origin of connections, and ports. Sequence information provides especially strong constraint, even if it incomplete. We find an important place for active, on-line human sampling of flight plans, as well as pre-authorization for non-standard paths, and alerts for deviation from path. We also find a place for improved user profiling and a paradigm shift from ex-post log-based reconstruction of user activity to ex-ante declaration.","PeriodicalId":6510,"journal":{"name":"2016 IEEE International Carnahan Conference on Security Technology (ICCST)","volume":"15 1","pages":"1-6"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84189438","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Predicting and explaining identity risk, exposure and cost using the ecosystem of identity attributes
Pub Date: 2016-10-01 | DOI: 10.1109/CCST.2016.7815701 | Pages: 1-8
Razieh Nokhbeh Zaeem, S. Budalakoti, K. Suzanne Barber, Muhibur Rasheed, C. Bajaj
Personally Identifiable Information (PII) is commonly used in both the physical and cyber worlds to perform personal authentication. A 2014 Department of Justice report estimated that roughly 7% of American households reported some type of identity theft in the previous year, involving the theft and fraudulent use of such PII. Establishing a comprehensive map of PII attributes and their relationships is a fundamental first step to protect users from identity theft. In this paper, we present the mathematical representation and implementation of a model of Personally Identifiable Information attributes for people, named the Identity Ecosystem. Each PII attribute (e.g., name, age, and Social Security Number) is modeled as a graph node. Probabilistic relationships between PII attributes are modeled as graph edges. We have implemented this Identity Ecosystem model as a Bayesian belief network (with cycles allowed) and use Gibbs sampling to approximate the posteriors in our model. We populated the model from two sources of information: 1) actual theft and fraud cases; and 2) experts' estimates. We have utilized our Identity Ecosystem implementation to predict as well as explain the risk of losing PII and the liability associated with fraudulent use of these PII attributes. For better human understanding of the complex identity ecosystem, we also provide a 3D visualization of the Identity Ecosystem model and of queries executed on the model. This research aims to advance a fundamental understanding of PII attributes and lead to better methods for preventing identity theft and fraud.
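As a toy illustration of the sampling step (the three-node network and its probability tables are hypothetical, far smaller than the Identity Ecosystem), Gibbs sampling can estimate the posterior exposure probability of one PII attribute given observed identity theft:

```python
# Gibbs sampling on a toy PII network: exposure of a name and an SSN raises
# the probability of identity theft; with theft observed, we resample the
# unobserved nodes from their full conditionals and count ssn = 1 states.
import random

def p_name(name):            # prior on name exposure
    return 0.30 if name else 0.70

def p_ssn(ssn):              # prior on SSN exposure
    return 0.05 if ssn else 0.95

def p_theft(theft, name, ssn):   # CPT: theft given the two exposures
    p = {(0, 0): 0.01, (1, 0): 0.05, (0, 1): 0.30, (1, 1): 0.60}[(name, ssn)]
    return p if theft else 1 - p

def joint(name, ssn, theft):
    return p_name(name) * p_ssn(ssn) * p_theft(theft, name, ssn)

def gibbs(n_iter=20_000):
    state, hits = {"name": 0, "ssn": 0, "theft": 1}, 0   # theft is evidence
    for _ in range(n_iter):
        for var in ("name", "ssn"):                       # unobserved nodes
            # Full conditional of var is proportional to the joint with var
            # toggled, everything else held fixed.
            w = {v: joint(**dict(state, **{var: v})) for v in (0, 1)}
            state[var] = 1 if random.random() < w[1] / (w[0] + w[1]) else 0
        hits += state["ssn"]
    return hits / n_iter

print(gibbs())   # P(ssn exposed | theft) - well above the 5% prior (~0.48)
```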