LASR: A tool for large scale annotation of software requirements

I. Hussain, O. Ormandjieva, Leila Kosseim
Published in: 2012 Second IEEE International Workshop on Empirical Requirements Engineering (EmpiRE)
DOI: 10.1109/EmpiRE.2012.6347683
Publication date: 2012-11-12
Citations: 9

Abstract

Annotation of software requirements documents is performed by experts during the requirements analysis phase to extract crucial knowledge from informally written textual requirements. Different annotation tasks target the extraction of different types of information and require experts specialized in the field. Large-scale annotation tasks require multiple experts, and the limited number of available experts can make such tasks overwhelming and very costly without proper tool support. In this paper, we present our annotation tool, LASR, which can aid requirements analysis tasks by attaining more accurate annotations. Our evaluation of the tool demonstrates that the annotation data collected by LASR from trained non-experts can help compute gold-standard annotations that strongly agree with the true gold standards set by the experts, thereby eliminating the need to conduct costly adjudication sessions for large-scale annotation work.
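The core idea in the abstract, deriving gold-standard labels from several trained non-expert annotators and checking how well they agree with expert labels, can be sketched as follows. This is an illustrative sketch only: the paper does not specify LASR's aggregation method, so simple majority voting and Cohen's kappa are assumptions here, and all names and data are hypothetical.

```python
from collections import Counter

def majority_vote(annotations):
    """Aggregate per-item labels from several annotators by simple majority.

    annotations: list of per-item label lists, e.g. [["ambiguous", "clear"], ...]
    Ties are broken by the label encountered first (Counter insertion order).
    """
    return [Counter(labels).most_common(1)[0][0] for labels in annotations]

def cohens_kappa(a, b):
    """Chance-corrected agreement between two label sequences."""
    assert len(a) == len(b) and a
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Expected agreement if the two labelings were independent.
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[lbl] * cb[lbl] for lbl in set(a) | set(b)) / (n * n)
    if expected == 1.0:  # both sequences use a single identical label
        return 1.0
    return (observed - expected) / (1 - expected)

# Three hypothetical non-expert annotators labeling five requirements
# as "ambiguous" or "clear" (made-up data for illustration).
non_expert = [
    ["ambiguous", "ambiguous", "clear"],
    ["clear", "clear", "clear"],
    ["ambiguous", "ambiguous", "ambiguous"],
    ["clear", "ambiguous", "clear"],
    ["ambiguous", "clear", "ambiguous"],
]
expert = ["ambiguous", "clear", "ambiguous", "clear", "ambiguous"]

gold = majority_vote(non_expert)
print("aggregated gold standard:", gold)
print("kappa vs. expert labels:", cohens_kappa(gold, expert))
```

A high kappa between the aggregated non-expert labels and the expert labels would support the paper's claim that non-expert annotation data can stand in for costly expert adjudication sessions.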