A Toolkit for Testing Stochastic Simulations against Statistical Oracles

Matthew Patrick, R. Donnelly, C. Gilligan
{"title":"A Toolkit for Testing Stochastic Simulations against Statistical Oracles","authors":"Matthew Patrick, R. Donnelly, C. Gilligan","doi":"10.1109/ICST.2017.50","DOIUrl":null,"url":null,"abstract":"Stochastic simulations are developed and employed across many fields, to advise governmental policy decisions and direct future research. Faulty simulation software can have serious consequences, but its correctness is difficult to determine due to complexity and random behaviour. Stochastic simulations may output a different result each time they are run, whereas most testing techniques are designed for programs which (for a given set of inputs) always produce the same behaviour. In this paper, we introduce a new approach towards testing stochastic simulations using statistical oracles and transition probabilities. Our approach was implemented as a toolkit, which allows the frequency of state transitions to be tested, along with their final output distribution. We evaluated our toolkit on eight simulation programs from a variety fields and found it can detect errors at least three times smaller (and in one case, over 1000 times smaller) than a conventional (tolerance threshold) approach.","PeriodicalId":112258,"journal":{"name":"2017 IEEE International Conference on Software Testing, Verification and Validation (ICST)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE International Conference on Software Testing, Verification and Validation (ICST)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICST.2017.50","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Stochastic simulations are developed and employed across many fields to inform governmental policy decisions and direct future research. Faulty simulation software can have serious consequences, but its correctness is difficult to determine due to complexity and random behaviour. Stochastic simulations may output a different result each time they are run, whereas most testing techniques are designed for programs which (for a given set of inputs) always produce the same behaviour. In this paper, we introduce a new approach to testing stochastic simulations using statistical oracles and transition probabilities. Our approach was implemented as a toolkit, which allows the frequency of state transitions to be tested, along with their final output distribution. We evaluated our toolkit on eight simulation programs from a variety of fields and found it can detect errors at least three times smaller (and in one case, over 1000 times smaller) than a conventional (tolerance threshold) approach.
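
The abstract does not detail the toolkit's internals, but the core idea of a statistical oracle over transition frequencies can be illustrated with a short, assumption-laden sketch. Everything in the code below (the two-state Markov model, the simulate function, and the significance level alpha) is hypothetical and not taken from the paper: it runs a toy stochastic simulation, counts the observed state transitions, and uses a chi-square test to check them against the specified transition probabilities, instead of comparing a single run's output to a fixed tolerance threshold.

```python
# Minimal sketch of a statistical oracle for a stochastic simulation.
# NOT the authors' toolkit: the two-state model and all parameters are
# hypothetical, chosen only to illustrate the general technique.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical specification: transition probabilities of a two-state model.
P_EXPECTED = np.array([[0.9, 0.1],
                       [0.2, 0.8]])

def simulate(n_steps, p=P_EXPECTED):
    """Toy stochastic simulation: a Markov chain over states {0, 1}."""
    state, states = 0, [0]
    for _ in range(n_steps):
        state = rng.choice(2, p=p[state])
        states.append(state)
    return np.array(states)

def transition_counts(states):
    """Count observed transitions between consecutive states."""
    counts = np.zeros((2, 2))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts

def statistical_oracle(states, alpha=0.01):
    """Chi-square test of observed transition frequencies against the
    expected transition probabilities (one test per source state)."""
    counts = transition_counts(states)
    for s in range(2):
        total = counts[s].sum()
        if total == 0:
            continue  # state never visited: no evidence either way
        expected = P_EXPECTED[s] * total
        _, p_value = stats.chisquare(counts[s], expected)
        if p_value < alpha:
            return False  # transition frequencies deviate significantly
    return True

states = simulate(10_000)
print("oracle passed:", statistical_oracle(states))
```

In the same spirit, the final output distribution could be checked with a goodness-of-fit test (e.g. scipy.stats.kstest) against its expected distribution. The point of the statistical oracle is that it accumulates evidence over many observed transitions or runs, which is what allows it to flag deviations far smaller than a per-run tolerance threshold can.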