This paper examines skin tone detection algorithms used by first-responder forensic tools such as File Hound. File Hound is a "field analysis" software application currently used by over 100 law enforcement agencies, both domestically and internationally. It is mainly used in forensic investigations to search for and identify pornographic images on a hard drive. Since File Hound's inception, several steps have been taken to improve its performance and expand its features. One such feature is a skin tone detection filter that can identify images with a large skin color count among the aggregate image results found by File Hound. This filter is based on the premise that there is a positive correlation between images with a large skin color count and images that are pornographic in nature. A novel skin tone detection filter was developed and tested against random images obtained from the Compaq Image database for skin tone detection. The test results are encouraging in terms of accuracy and low error rates: type I = 20.64%, type II = 0.81%, accuracy = 78.55%.
{"title":"A Novel Skin Tone Detection Algorithm for Contraband Image Analysis","authors":"A. Choudhury, M. Rogers, W. Gillam, Keith Watson","doi":"10.1109/SADFE.2008.12","DOIUrl":"https://doi.org/10.1109/SADFE.2008.12","url":null,"abstract":"This paper examines skin tone detection algorithms used by first responder forensic tools such as File Hound. File Hound is a \"field analysis\" software application that is currently being used by over 100 law enforcement agencies, both internationally and domestically. It is mainly used in forensic investigations to search and identify pornographic images from a hard drive. Since the conception of File Hound, several steps have been taken to improve its performance and expand its features. One such feature is a skin tone detection filter that can identify images with a large skin color count from the aggregate image results found by File Hound. This filter is based on the idea that there is a positive correlation between images with a large skin color count and images that are pornographic in nature. A novel skin tone detection filter was developed and this filter was tested against random images obtained from the Compaq Image database for skin tone detection. The results of the test are encouraging in terms of accuracy and low error rates: type I = 20.64%, type II = 0.81%, accuracy = 78.55%.","PeriodicalId":391486,"journal":{"name":"2008 Third International Workshop on Systematic Approaches to Digital Forensic Engineering","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131324435","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
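The filter's core heuristic — count the pixels that fall inside a skin color range and flag images where that count is large — can be sketched as follows. The RGB rule (a common published heuristic) and the 30% threshold are illustrative assumptions, not the paper's actual color model or cutoff:

```python
def is_skin(r, g, b):
    # A widely used RGB skin-tone rule; stands in for the paper's
    # (unspecified here) color model.
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def skin_ratio(pixels):
    """Fraction of (r, g, b) pixels classified as skin."""
    pixels = list(pixels)
    if not pixels:
        return 0.0
    return sum(is_skin(*p) for p in pixels) / len(pixels)

def flag_image(pixels, threshold=0.3):
    # Flag an image whose skin-pixel ratio exceeds a tunable threshold;
    # 0.3 is an assumed value, not taken from the paper.
    return skin_ratio(pixels) >= threshold
```

Tuning the threshold trades the two error types against each other: lowering it catches more true positives (fewer type II errors) at the cost of more false alarms (type I).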
Investigation of security incidents is of great importance because it allows investigators to trace back the actions taken by intruders. In this paper we develop a formal technique for digital investigation based on the use of Incident Response Probabilistic Cognitive Maps. Three main issues are addressed here: (1) construction and extraction of plausible known attack scenarios, (2) construction of hypothetical scenarios and their validation using a logic-based formalism, and (3) selection of optimal countermeasures addressing the detected attacks.
{"title":"Cognitive-Maps Based Investigation of Digital Security Incidents","authors":"S. Rekhis, J. Krichène, N. Boudriga","doi":"10.1109/SADFE.2008.20","DOIUrl":"https://doi.org/10.1109/SADFE.2008.20","url":null,"abstract":"Investigation of security incidents is of great importance as it allows to trace back the actions taken by the intruders. In this paper we develop a formal technique for digital investigation based on the use of Incident Response Probabilistic Cognitive Maps. Three main issues are addressed here: (1) construction and extraction of plausible known attack scenarios, (2) construction of hypothetical scenarios and their validation using a logic-based formalism, and (3) selection of optimal counter-measures addressing the detected attacks.","PeriodicalId":391486,"journal":{"name":"2008 Third International Workshop on Systematic Approaches to Digital Forensic Engineering","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123145747","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
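One way to picture the scenario-extraction step: model the incident as a directed graph of attack actions whose edges carry plausibility weights, and take the most plausible known scenario to be the highest-product path from entry point to observed damage. The graph and the max-product search below are a minimal illustrative sketch, not the paper's actual Incident Response Probabilistic Cognitive Map formalism:

```python
# Hypothetical cognitive map: action -> [(next_action, plausibility)]
attack_map = {
    "recon":    [("exploit", 0.7), ("phish", 0.4)],
    "phish":    [("exploit", 0.6)],
    "exploit":  [("escalate", 0.8)],
    "escalate": [("exfiltrate", 0.9)],
}

def best_scenario(graph, start, goal, path=None, score=1.0):
    """Return (score, path) for the most plausible start->goal chain,
    scoring a chain as the product of its edge plausibilities."""
    path = (path or []) + [start]
    if start == goal:
        return score, path
    best = (0.0, [])
    for nxt, p in graph.get(start, []):
        if nxt not in path:  # avoid revisiting actions (cycles)
            cand = best_scenario(graph, nxt, goal, path, score * p)
            if cand[0] > best[0]:
                best = cand
    return best
```

Here the direct chain recon → exploit → escalate → exfiltrate (0.7 × 0.8 × 0.9) beats the longer detour through phish, so it would be reported as the plausible known scenario.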
Standard ways of calculating the similarity of different computer programs are needed in computer science. Such measurements are useful in many areas, including clone detection, refactoring, compiler optimization, and run-time optimization. They are particularly important for uncovering plagiarism, trade secret theft, copyright infringement, and patent infringement. Other uses include locating open source code within a proprietary program and determining the authors of different programs. In a previous paper (R. Zeidman, 2006) I introduced the concept of source code correlation, presented a theoretical basis for such a measure, and described a program, CodeMatch®, that compares software source code and calculates correlation. That paper compared the described method of source code correlation against existing methods of comparing source code and found it to be significantly superior. This paper refines that definition of source code correlation and presents a new, more robust definition of multidimensional source code correlation.
{"title":"Multidimensional Correlation of Software Source Code","authors":"R. Zeidman","doi":"10.1109/SADFE.2008.9","DOIUrl":"https://doi.org/10.1109/SADFE.2008.9","url":null,"abstract":"Standard ways of calculating the similarity of different computer programs are needed in computer science. Such measurements can be useful in many different areas such as clone detection, refactoring, compiler optimization, and run-time optimization. Such standards are particularly important for uncovering plagiarism, trade secret theft, copyright infringement, and patent infringement. Other uses include locating open source code within a proprietary program and determining the authors of different programs. In a previous paper (R. Zeidman, 2006) I introduced the concept of source code correlation, presented a theoretical basis for such a measure, and described a program, CodeMatchreg, that compares software source code and calculates correlation. That paper compared the described method of source code correlation against existing methods of comparing source code and found it to be significantly superior. This paper refines that definition of source code correlation and presents a new, more robust, definition of multidimensional source code correlation.","PeriodicalId":391486,"journal":{"name":"2008 Third International Workshop on Systematic Approaches to Digital Forensic Engineering","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133110071","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
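As a toy illustration of what one axis of a source code similarity measure can look like — this is not Zeidman's CodeMatch algorithm or the multidimensional correlation the paper defines — consider comparing the sets of identifiers two files use; a multidimensional measure would combine several such axes (identifiers, statements, comments, and so on):

```python
import re

def identifiers(source):
    """Extract the set of identifier-like tokens from source text,
    ignoring whitespace, punctuation, and repetition."""
    return set(re.findall(r"[A-Za-z_][A-Za-z_0-9]*", source))

def correlation(a, b):
    """Jaccard similarity of the two identifier sets, in [0, 1]."""
    ta, tb = identifiers(a), identifiers(b)
    if not ta and not tb:
        return 1.0  # two empty files are trivially identical
    return len(ta & tb) / len(ta | tb)
```

Because the measure ignores ordering and repetition, renaming a few variables lowers it only proportionally — which is the kind of robustness to trivial edits a plagiarism-oriented correlation needs.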
M. Losavio, D. Keeling, Adel Said Elmaghraby, George E. Higgins, J. Shutt
Network systems capture data about electronic activity in new, sometimes unprecedented forms. These new forms offer new, powerful tactical tools for investigating electronic malfeasance under traditional legal regulation of state power, particularly the Fourth Amendment's limitations on police searches and seizures under the U.S. Constitution. But misappreciation of identity and authenticity issues with electronic data, particularly IP addresses and account numbers, raises issues of public policy, privacy, and proper oversight of network forensic investigations. The digital age uses digital facts, particularly alphanumeric identifiers used for addressing, hashing, authentication, and identification in online transactions. These artifacts become the evidence supporting a state search or seizure. Given the technical issues with evidence preservation and examination in electronic storage media, search warrants relating to computers may direct the seizure of computers and their removal off-site for examination in a computer forensics facility. This can disrupt or even destroy records, objects, and systems on those computers. This reliance on simple digital identification with minimal authentication further corrodes privacy and liberty rights in new ways. Technical security cannot protect privacy and security with such attitudes toward data. Security policy must extend into all domains of society. The challenge will be to establish a balance in which courts set a stricter boundary for state searches and seizures based on electronic evidence of questionable reliability. As the United States v. Gourde court observed, "We are acutely aware that the digital universe poses particular challenges with respect to the Fourth Amendment." That awareness still needs greater knowledge of the facts of identity and authenticity of electronic data as evidence, its mutability and evanescence, if the rights, liberties, and privacy of Americans are to be protected.
{"title":"Network Forensics: Network Data and State Seizures in the United States","authors":"M. Losavio, D. Keeling, Adel Said Elmaghraby, George E. Higgins, J. Shutt","doi":"10.1109/SADFE.2008.15","DOIUrl":"https://doi.org/10.1109/SADFE.2008.15","url":null,"abstract":"Network systems capture data about electronic activity in new, sometimes unprecedented forms. These new forms offer new, powerful tactical tools for investigations of electronic malfeasance under traditional legal regulation of state power, particularly that of Fourth Amendment limitations on police searches and seizures under the U.S. Constitution. But misappreciation of identity and authenticity issues with electronic data, particularly IP addresses and account numbers, raises issues of public policy, privacy and proper oversight of network forensic investigations. The digital age uses digital facts, particularly alphanumerical identifiers used for addressing, hashing and authentication and identification in online transactions. These artifacts become the evidence supporting a state search or seizure. Given the technical issues with evidence preservation and examination in electronic storage media, search warrants relating to computers may direct the seizure of computers and removal off-site for examination in a computer forensics facility. This can disrupt or even destroy records, objects and systems on those computers. This reliance on simple digital identification with minimal authentication further corrodes privacy and liberty rights in new ways. Technical security cannot protect privacy and security with such attitudes towards data. Security policy must extend into all domains of society. The challenge will be to establish a balance where courts set a stricter boundary for state searches and seizures based on electronic evidence of questionable reliability. As the United States v. Gourde court observed \"We are acutely aware that the digital universe poses particular challenges with respect to the Fourth Amendment.\" That awareness still needs greater knowledge of the facts of identity and authenticity of electronic data as evidence, its mutability and evanescence, if the rights, liberties, and privacy of Americans are to be protected.","PeriodicalId":391486,"journal":{"name":"2008 Third International Workshop on Systematic Approaches to Digital Forensic Engineering","volume":"363 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133937390","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Different users apply computer forensic systems, models, and terminology in very different ways. They often make incompatible assumptions and reach different conclusions about the validity and accuracy of the methods they use to log, audit, and present forensic data. This is problematic, because these fields are related, and results from one can be meaningful to the others. We present several forensic systems and discuss situations in which they produce valid and accurate conclusions and also situations in which their accuracy is suspect. We also present forensic models and discuss areas in which they are useful and areas in which they could be augmented. Finally, we present some recommendations about how computer scientists, forensic practitioners, lawyers, and judges could build more complete models of forensics that take into account appropriate legal details and lead to scientifically valid forensic analysis.
{"title":"Computer Forensics in Forensis","authors":"S. Peisert, M. Bishop, K. Marzullo","doi":"10.1145/1368506.1368521","DOIUrl":"https://doi.org/10.1145/1368506.1368521","url":null,"abstract":"Different users apply computer forensic systems, models, and terminology in very different ways. They often make incompatible assumptions and reach different conclusions about the validity and accuracy of the methods they use to log, audit, and present forensic data. This is problematic, because these fields are related, and results from one can be meaningful to the others. We present several forensic systems and discuss situations in which they produce valid and accurate conclusions and also situations in which their accuracy is suspect. We also present forensic models and discuss areas in which they are useful and areas in which they could be augmented. Finally, we present some recommendations about how computer scientists, forensic practitioners, lawyers, and judges could build more complete models of forensics that take into account appropriate legal details and lead to scientifically valid forensic analysis.","PeriodicalId":391486,"journal":{"name":"2008 Third International Workshop on Systematic Approaches to Digital Forensic Engineering","volume":"102 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122459317","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}