Bias and fairness in software and automation tools in digital forensics

Razaq Jinad, Khushi Gupta, Ecem Simsek, Bing Zhou
{"title":"数字取证软件和自动化工具的偏见和公平性","authors":"Razaq Jinad, Khushi Gupta, Ecem Simsek, Bing Zhou","doi":"10.20517/jsss.2023.41","DOIUrl":null,"url":null,"abstract":"The proliferation of software tools and automated techniques in digital forensics has brought about some controversies regarding bias and fairness. Different biases exist and have been proven in some civil and criminal cases. In our research, we analyze and discuss these biases present in software tools and automation systems used by law enforcement organizations and in court proceedings. Furthermore, we present real-life cases and scenarios where some of these biases have determined or influenced these cases. We were also able to provide recommendations for reducing bias in software tools, which we hope will be the foundation for a framework that reduces or eliminates bias from software tools used in digital forensics. In conclusion, we anticipate that this research can help increase validation in digital forensics software tools and ensure users' trust in the tools and automation techniques.","PeriodicalId":509397,"journal":{"name":"Journal of Surveillance, Security and Safety","volume":"5 2","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Bias and fairness in software and automation tools in digital forensics\",\"authors\":\"Razaq Jinad, Khushi Gupta, Ecem Simsek, Bing Zhou\",\"doi\":\"10.20517/jsss.2023.41\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The proliferation of software tools and automated techniques in digital forensics has brought about some controversies regarding bias and fairness. Different biases exist and have been proven in some civil and criminal cases. In our research, we analyze and discuss these biases present in software tools and automation systems used by law enforcement organizations and in court proceedings. Furthermore, we present real-life cases and scenarios where some of these biases have determined or influenced these cases. We were also able to provide recommendations for reducing bias in software tools, which we hope will be the foundation for a framework that reduces or eliminates bias from software tools used in digital forensics. 
In conclusion, we anticipate that this research can help increase validation in digital forensics software tools and ensure users' trust in the tools and automation techniques.\",\"PeriodicalId\":509397,\"journal\":{\"name\":\"Journal of Surveillance, Security and Safety\",\"volume\":\"5 2\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-01-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Surveillance, Security and Safety\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.20517/jsss.2023.41\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Surveillance, Security and Safety","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.20517/jsss.2023.41","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

The proliferation of software tools and automated techniques in digital forensics has brought about some controversies regarding bias and fairness. Different biases exist and have been proven in some civil and criminal cases. In our research, we analyze and discuss these biases present in software tools and automation systems used by law enforcement organizations and in court proceedings. Furthermore, we present real-life cases and scenarios where some of these biases have determined or influenced these cases. We were also able to provide recommendations for reducing bias in software tools, which we hope will be the foundation for a framework that reduces or eliminates bias from software tools used in digital forensics. In conclusion, we anticipate that this research can help increase validation in digital forensics software tools and ensure users' trust in the tools and automation techniques.
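The abstract does not specify how bias in an automated tool would be measured, so the following is only a minimal, hypothetical sketch of one common way such bias is quantified: comparing false positive rates of a tool's decisions across groups. The record format, group labels, and sample data are illustrative assumptions, not the authors' methodology.

```python
# Hypothetical illustration (not from the paper): quantify bias in an automated
# decision tool by comparing false positive rates (FPR) across groups.
from collections import defaultdict

def false_positive_rates(records):
    """records: iterable of dicts with keys 'group', 'predicted', 'actual'
    (predicted/actual are booleans). Returns FPR per group."""
    fp = defaultdict(int)   # false positives per group
    neg = defaultdict(int)  # actual negatives per group
    for r in records:
        if not r["actual"]:
            neg[r["group"]] += 1
            if r["predicted"]:
                fp[r["group"]] += 1
    return {g: fp[g] / neg[g] for g in neg if neg[g] > 0}

def fpr_disparity(records):
    """Largest gap in false positive rate between any two groups."""
    rates = false_positive_rates(records)
    return max(rates.values()) - min(rates.values()) if rates else 0.0

# Illustrative data: a tool that flags items as matches against a database.
sample = [
    {"group": "A", "predicted": True,  "actual": False},
    {"group": "A", "predicted": False, "actual": False},
    {"group": "B", "predicted": False, "actual": False},
    {"group": "B", "predicted": False, "actual": False},
]
print(false_positive_rates(sample))  # {'A': 0.5, 'B': 0.0}
print(fpr_disparity(sample))         # 0.5
```

A large disparity on data like this would be one concrete signal, among several possible fairness criteria, that a tool's errors fall unevenly across groups.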