Does Treatment Adherence Impact Experiment Results in TDD?

IF 6.5 · Q1, Computer Science (Software Engineering) · IEEE Transactions on Software Engineering · Pub Date: 2024-11-15 · DOI: 10.1109/TSE.2024.3497332
Itir Karac, Jose Ignacio Panach, Burak Turhan, Natalia Juristo
Citations: 0

Abstract

Context: In software engineering (SE) experiments, the way a treatment is applied can affect results. Different interpretations of how to apply the treatment, and different decisions about treatment adherence, can lead to different results when the data are analysed.

Objective: This paper studies whether treatment adherence has an impact on the results of an SE experiment.

Method: The experiment used as the test case for our research uses Test-Driven Development (TDD) and Incremental Test-Last Development (ITLD) as treatments. We reported the design and results of this experiment, in which 24 participants were recruited from industry, elsewhere. Here, we compare experiment results depending on whether we use data from adherent participants only or data from all participants irrespective of their adherence to treatments.

Results: Only 40% of the participants adhered to both the TDD protocol and the ITLD protocol; 27% never followed TDD; 20% used TDD even in the control group; and 13% were defiers (used TDD in the ITLD session but not in the TDD session). Considering that both TDD and ITLD are less complex than other SE methods, we can hypothesize that more complex SE techniques could see even lower adherence to the treatment.

Conclusion: TDD and ITLD are applied differently across participants. Training participants may not be enough to ensure medium to large adherence among experiment participants. Adherence to treatments impacts results and should not be taken for granted in SE experiments.
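The Results sort participants into four adherence groups based on which sessions they actually used TDD in. A minimal sketch of that classification follows; it borrows the standard causal-inference labels (compliers, never-takers, always-takers, defiers — the paper's abstract names only "defiers" explicitly), and the session data below are hypothetical illustrations, not the study's dataset.

```python
def classify(used_tdd_in_tdd_session: bool, used_tdd_in_itld_session: bool) -> str:
    """Return the adherence group for one participant, given whether they
    used TDD in each of the two experimental sessions."""
    if used_tdd_in_tdd_session and not used_tdd_in_itld_session:
        return "complier"      # followed both assigned protocols (40% in the study)
    if not used_tdd_in_tdd_session and not used_tdd_in_itld_session:
        return "never-taker"   # never followed TDD (27%)
    if used_tdd_in_tdd_session and used_tdd_in_itld_session:
        return "always-taker"  # used TDD even in the control (ITLD) session (20%)
    return "defier"            # used TDD in the ITLD session but not in the TDD session (13%)

# Hypothetical participants: (used TDD in TDD session, used TDD in ITLD session)
participants = [(True, False), (False, False), (True, True), (False, True)]
print([classify(*p) for p in participants])
# → ['complier', 'never-taker', 'always-taker', 'defier']
```

Comparing results computed over compliers only against results computed over all participants is the paper's core analysis; misclassifying these groups (or ignoring them) is exactly the threat to validity the authors warn about.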
Source journal: IEEE Transactions on Software Engineering (Engineering: Electrical & Electronic)
CiteScore: 9.70
Self-citation rate: 10.80%
Articles published per year: 724
Review time: 6 months
About the journal: IEEE Transactions on Software Engineering seeks contributions comprising well-defined theoretical results and empirical studies with potential impacts on software construction, analysis, or management. The scope of this Transactions extends from fundamental mechanisms to the development of principles and their application in specific environments. Specific topic areas include:
a) Development and maintenance methods and models: Techniques and principles for specifying, designing, and implementing software systems, encompassing notations and process models.
b) Assessment methods: Software tests, validation, reliability models, test and diagnosis procedures, software redundancy, design for error control, and measurements and evaluation of process and product aspects.
c) Software project management: Productivity factors, cost models, schedule and organizational issues, and standards.
d) Tools and environments: Specific tools, integrated tool environments, associated architectures, databases, and parallel and distributed processing issues.
e) System issues: Hardware-software trade-offs.
f) State-of-the-art surveys: Syntheses and comprehensive reviews of the historical development within specific areas of interest.