{"title":"使用注释从现有文档开发验收测试:实验","authors":"David Connolly, Frank Keenan, F. McCaffery","doi":"10.1109/IWAST.2009.5069050","DOIUrl":null,"url":null,"abstract":"The importance of good software testing is often reported. Traditionally, acceptance testing is the last stage of the testing process before release to the customer. Unfortunately, it is not always appropriate to wait so long for customer feedback. Emerging agile methods recognise this and promote close interaction between the customer and developers for early acceptance testing, often before implementation commences. Indeed, Acceptance Test Driven Development (ATDD) is a process that uses customer interaction to define tests and tool support to automate and execute these. However, with existing tools, tests are usually written from new descriptions or rewritten from existing documentation. Here, the challenge is to allow developers and customers to annotate existing documentation and automatically generate acceptance tests without rewrites or new descriptions. This paper introduces the related ideas and describes a particular experiment that assesses the value of using annotated text to create acceptance tests.","PeriodicalId":401585,"journal":{"name":"2009 ICSE Workshop on Automation of Software Test","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Developing acceptance tests from existing documentation using annotations: An experiment\",\"authors\":\"David Connolly, Frank Keenan, F. McCaffery\",\"doi\":\"10.1109/IWAST.2009.5069050\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The importance of good software testing is often reported. Traditionally, acceptance testing is the last stage of the testing process before release to the customer. Unfortunately, it is not always appropriate to wait so long for customer feedback. Emerging agile methods recognise this and promote close interaction between the customer and developers for early acceptance testing, often before implementation commences. Indeed, Acceptance Test Driven Development (ATDD) is a process that uses customer interaction to define tests and tool support to automate and execute these. However, with existing tools, tests are usually written from new descriptions or rewritten from existing documentation. Here, the challenge is to allow developers and customers to annotate existing documentation and automatically generate acceptance tests without rewrites or new descriptions. 
This paper introduces the related ideas and describes a particular experiment that assesses the value of using annotated text to create acceptance tests.\",\"PeriodicalId\":401585,\"journal\":{\"name\":\"2009 ICSE Workshop on Automation of Software Test\",\"volume\":\"24 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2009-05-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2009 ICSE Workshop on Automation of Software Test\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IWAST.2009.5069050\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 ICSE Workshop on Automation of Software Test","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IWAST.2009.5069050","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Developing acceptance tests from existing documentation using annotations: An experiment
The importance of good software testing is widely reported. Traditionally, acceptance testing is the last stage of the testing process before release to the customer. Unfortunately, it is not always appropriate to wait so long for customer feedback. Emerging agile methods recognise this and promote close interaction between the customer and developers for early acceptance testing, often before implementation commences. Indeed, Acceptance Test Driven Development (ATDD) is a process that uses customer interaction to define tests and tool support to automate and execute them. However, with existing tools, tests are usually written from new descriptions or rewritten from existing documentation. The challenge, then, is to allow developers and customers to annotate existing documentation and automatically generate acceptance tests without rewrites or new descriptions. This paper introduces the related ideas and describes an experiment that assesses the value of using annotated text to create acceptance tests.
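To make the idea concrete, the following is a minimal sketch in Python of how annotations embedded in an existing requirements document could be turned into an executable acceptance-test skeleton. This is not the authors' tool: the [[action: ...]] / [[expect: ...]] annotation syntax, the sample document, and the shape of the generated test are all hypothetical assumptions for illustration, since the abstract does not specify the annotation format or tooling used in the experiment.

import re

# Hypothetical annotated requirement: the prose stays readable for the
# customer, while the bracketed annotations mark the testable parts.
DOC = """
When a customer withdraws 50 euro [[action: withdraw 50]] from an account
holding 100 euro, the receipt shows a remaining balance of 50 euro
[[expect: balance == 50]].
"""

# Matches [[action: ...]] and [[expect: ...]] annotations (assumed syntax).
ANNOTATION = re.compile(r"\[\[(action|expect):\s*(.+?)\]\]")

def generate_test(doc, name="test_withdrawal"):
    """Emit a plain assert-style test skeleton from the annotations."""
    lines = ["def %s(account):" % name]
    for kind, body in ANNOTATION.findall(doc):
        if kind == "action":
            verb, arg = body.split(maxsplit=1)
            lines.append("    account.%s(%s)" % (verb, arg))
        else:  # an "expect" annotation becomes an assertion on the fixture
            lines.append("    assert account.%s" % body)
    return "\n".join(lines)

if __name__ == "__main__":
    print(generate_test(DOC))

Running the script prints a small test function (perform the withdrawal, then assert the balance) that a developer could drop into a test suite. The point of the approach described in the paper is that the original document is only annotated, never rewritten or replaced by a separate test description.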