Jesse Mullis, Cheng Chen, Scott Ferguson, Beshoy Morkos
{"title":"深度神经网络在自然语言处理中按来源和功能分类需求的有效性——BERT在系统需求中的应用","authors":"Jesse Mullis, Cheng Chen, Scott Ferguson, Beshoy Morkos","doi":"10.1115/1.4063764","DOIUrl":null,"url":null,"abstract":"Abstract Given the foundational role of system requirements in design projects, designers can benefit from classifying, comparing, and observing connections between requirements. Manually undertaking these processes, however, can be laborious and time-consuming. Previous studies have employed Bidirectional Encoder Representations from Transformers (BERT), a state-of-the-art natural language processing (NLP) deep neural network model, to automatically analyze written requirements. Yet, it remains unclear whether BERT can sufficiently capture the nuances that differentiate requirements between and within design documents. This work evaluates BERT’s performance on two requirement classification tasks (one inter- document and one intra-document) executed on a corpus of 1,303 requirements sourced from five system design projects. First, in the “parent document classification” task, a BERT model is fine-tuned to classify requirements according to their originating project. A separate BERT model is then fine-tuned on a “functional classification” task where each requirement is classified as either functional or nonfunctional. Our results also include a comparison with a baseline model, Word2Vec, and demonstrate that our model achieves higher classification accuracy. When evaluated on test sets, the former model receives a Matthews correlation coefficient (MCC) of 0.95, while the latter receives an MCC of 0.82, indicating BERT’s ability to reliably distinguish requirements. This work then explores the application of BERT’s representations, known as embeddings, to identify similar requirements and predict requirement change.","PeriodicalId":50137,"journal":{"name":"Journal of Mechanical Design","volume":"6 8","pages":"0"},"PeriodicalIF":2.9000,"publicationDate":"2023-11-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Efficacy of Deep Neural Networks in Natural Language Processing for Classifying Requirements by Origin and Functionality: An Application of BERT in System Requirement\",\"authors\":\"Jesse Mullis, Cheng Chen, Scott Ferguson, Beshoy Morkos\",\"doi\":\"10.1115/1.4063764\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract Given the foundational role of system requirements in design projects, designers can benefit from classifying, comparing, and observing connections between requirements. Manually undertaking these processes, however, can be laborious and time-consuming. Previous studies have employed Bidirectional Encoder Representations from Transformers (BERT), a state-of-the-art natural language processing (NLP) deep neural network model, to automatically analyze written requirements. Yet, it remains unclear whether BERT can sufficiently capture the nuances that differentiate requirements between and within design documents. This work evaluates BERT’s performance on two requirement classification tasks (one inter- document and one intra-document) executed on a corpus of 1,303 requirements sourced from five system design projects. First, in the “parent document classification” task, a BERT model is fine-tuned to classify requirements according to their originating project. 
A separate BERT model is then fine-tuned on a “functional classification” task where each requirement is classified as either functional or nonfunctional. Our results also include a comparison with a baseline model, Word2Vec, and demonstrate that our model achieves higher classification accuracy. When evaluated on test sets, the former model receives a Matthews correlation coefficient (MCC) of 0.95, while the latter receives an MCC of 0.82, indicating BERT’s ability to reliably distinguish requirements. This work then explores the application of BERT’s representations, known as embeddings, to identify similar requirements and predict requirement change.\",\"PeriodicalId\":50137,\"journal\":{\"name\":\"Journal of Mechanical Design\",\"volume\":\"6 8\",\"pages\":\"0\"},\"PeriodicalIF\":2.9000,\"publicationDate\":\"2023-11-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Mechanical Design\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1115/1.4063764\",\"RegionNum\":3,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, MECHANICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Mechanical Design","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1115/1.4063764","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, MECHANICAL","Score":null,"Total":0}
Efficacy of Deep Neural Networks in Natural Language Processing for Classifying Requirements by Origin and Functionality: An Application of BERT in System Requirements
Abstract: Given the foundational role of system requirements in design projects, designers can benefit from classifying, comparing, and observing connections between requirements. Manually undertaking these processes, however, can be laborious and time-consuming. Previous studies have employed Bidirectional Encoder Representations from Transformers (BERT), a state-of-the-art natural language processing (NLP) deep neural network model, to automatically analyze written requirements. Yet, it remains unclear whether BERT can sufficiently capture the nuances that differentiate requirements between and within design documents. This work evaluates BERT’s performance on two requirement classification tasks (one inter-document and one intra-document) executed on a corpus of 1,303 requirements sourced from five system design projects. First, in the “parent document classification” task, a BERT model is fine-tuned to classify requirements according to their originating project. A separate BERT model is then fine-tuned on a “functional classification” task, where each requirement is classified as either functional or nonfunctional. Our results include a comparison with a baseline model, Word2Vec, and show that BERT achieves higher classification accuracy. When evaluated on test sets, the former model achieves a Matthews correlation coefficient (MCC) of 0.95, while the latter achieves an MCC of 0.82, indicating BERT’s ability to reliably distinguish requirements. This work then explores the application of BERT’s representations, known as embeddings, to identify similar requirements and predict requirement change.
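The two classification tasks described in the abstract follow a standard BERT fine-tuning recipe for sequence classification. The sketch below shows how such a pipeline could look for the functional/nonfunctional task using the Hugging Face `transformers` library. The file name `requirements.csv`, the column names `text` and `is_functional`, the `bert-base-uncased` checkpoint, and the hyperparameters are illustrative assumptions, not the authors' reported settings; only the overall workflow (fine-tune a pretrained BERT classifier, evaluate with the Matthews correlation coefficient) mirrors what the abstract states.

```python
# Minimal sketch of a functional/nonfunctional requirement classifier.
# Assumed: requirements.csv with columns "text" and "is_functional" (1 = functional).
import numpy as np
import pandas as pd
import torch
from sklearn.metrics import matthews_corrcoef
from sklearn.model_selection import train_test_split
from torch.utils.data import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)


class RequirementDataset(Dataset):
    """Tokenizes requirement statements and pairs them with binary labels."""

    def __init__(self, texts, labels, tokenizer, max_len=128):
        self.encodings = tokenizer(list(texts), truncation=True,
                                   padding="max_length", max_length=max_len)
        self.labels = list(labels)

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(int(self.labels[idx]))
        return item


def compute_metrics(eval_pred):
    # MCC is the metric reported in the abstract for both classification tasks.
    preds = np.argmax(eval_pred.predictions, axis=-1)
    return {"mcc": matthews_corrcoef(eval_pred.label_ids, preds)}


df = pd.read_csv("requirements.csv")  # hypothetical corpus file
train_df, test_df = train_test_split(df, test_size=0.2,
                                     stratify=df["is_functional"])

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-func-clf",
                           num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=RequirementDataset(train_df["text"],
                                     train_df["is_functional"], tokenizer),
    eval_dataset=RequirementDataset(test_df["text"],
                                    test_df["is_functional"], tokenizer),
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate())  # reports MCC on the held-out split
```

The abstract also mentions using BERT's embeddings to identify similar requirements. A minimal sketch of one common approach, assuming [CLS]-token pooling and cosine similarity (the paper may pool or compare embeddings differently):

```python
# Compare two requirements via their [CLS] embeddings (illustrative approach).
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")


def embed(text):
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = encoder(**inputs)
    return out.last_hidden_state[:, 0, :]  # [CLS] token embedding


a = embed("The system shall log all user actions.")
b = embed("The system shall record every action taken by a user.")
print(F.cosine_similarity(a, b).item())  # values near 1.0 suggest similar requirements
```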
Journal description:
The Journal of Mechanical Design (JMD) serves the broad design community as the venue for scholarly, archival research in all aspects of the design activity with emphasis on design synthesis. JMD has traditionally served the ASME Design Engineering Division and its technical committees, but it welcomes contributions from all areas of design with emphasis on synthesis. JMD communicates original contributions, primarily in the form of research articles of considerable depth, but also technical briefs, design innovation papers, book reviews, and editorials.