Pub Date: 2023-07-11, DOI: 10.3390/computers12070138
Elçin Yazıcı Arıcı, M. Kalogiannakis, Stamatios Papadakis
Preschoolers now play digital games on touch screens, e-toys, and electronic learning systems. Although digital games have an important place in children’s lives, little is known about the meanings children attach to them. In this context, the research aims to determine, with the help of metaphors, the perceptions of preschool children studying in different regions of Turkey regarding digital games. Four hundred twenty-one preschool children studying in the seven regions of Turkey participated in the research. The data were collected through the “Digital Game Metaphor Form”, which captures children’s perceptions of digital games, and through “Drawing and Visualization”, which comprises the symbolic pictures children draw of their feelings and thoughts. Phenomenology, a qualitative research design, was used in this study, and the data were analyzed using the content analysis method. The children produced 421 metaphors, which were grouped into seven categories: “Nature Images, Technology Images, Fantasy/Supernatural Images, Education Images, Affective/Motivational Images, Struggle Images, and Value Images”. When evaluated by region, the Black Sea Region ranked first in the “Fantasy/Supernatural Images” and “Affective/Motivational Images” categories, the Central Anatolia Region ranked first in the “Technology Images” and “Education Images” categories, and the Marmara Region ranked first in the “Nature Images” and “Value Images” categories. In addition, the Southeast Anatolia Region ranked first in the “Struggle Images” category.
{"title":"Preschool Children's Metaphoric Perceptions of Digital Games: A Comparison between Regions","authors":"Elçin Yazıcı Arıcı, M. Kalogiannakis, Stamatios Papadakis","doi":"10.3390/computers12070138","DOIUrl":"https://doi.org/10.3390/computers12070138","url":null,"abstract":"Preschoolers now play digital games on touch screens, e-toys and electronic learning systems. Although digital games have an important place in children’s lives, there needs to be more information about the meanings they attach to games. In this context, the research aims to determine the perceptions of preschool children studying in different regions of Turkey regarding digital games with the help of metaphors. Four hundred twenty-one preschool children studying in seven regions of Turkey participated in the research. The data were collected through the “Digital Game Metaphor Form” to determine children’s perceptions of digital games and through “Drawing and Visualization”, which comprises the symbolic pictures children draw of their feelings and thoughts. Phenomenology, a qualitative research model, was used in this study. The data were analyzed using the content analysis method. When the data were evaluated, the children had produced 421 metaphors collected in the following seven categories: “Nature Images, Technology Images, Fantasy/Supernatural Images, Education Images, Affective/Motivational Images, Struggle Images, and Value Images”. When evaluated based on regions, the Black Sea Region ranked first in the “Fantasy/Supernatural Images and Affective/Motivational Images” categories. In contrast, the Central Anatolia Region ranked first in the “Technology Images and Education Images” categories, and the Marmara Region ranked first in the “Nature Images and Value Images” categories. In addition, it was determined that the Southeast Anatolia Region ranks first in the “Struggle Images” category.","PeriodicalId":10526,"journal":{"name":"Comput.","volume":"64 1","pages":"138"},"PeriodicalIF":0.0,"publicationDate":"2023-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84814043","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-07-10, DOI: 10.3390/computation11070139
Hossam A. Gabbar, O. Adegboro, Abderrazak Chahid, Jing Ren
In a nuclear power plant (NPP), tools are visually inspected to ensure their integrity before and after their use in the nuclear reactor. This manual inspection is usually performed by qualified technicians and takes a long time (weeks to months). In this work, we propose an automated tool-inspection approach that uses a classification model for anomaly detection. The deep learning model classifies computed tomography (CT) images as defective (with missing components) or defect-free. Moreover, the proposed algorithm enables incremental learning (IL) through a proposed thresholding technique, which maintains high prediction confidence by continuously training the deployed anomaly detection model online. The proposed algorithm is compared with existing state-of-the-art IL methods, showing that it helps the model learn anomaly patterns quickly. In addition, it enhances the classification model’s confidence while preserving a desired minimal performance.
{"title":"Incremental Learning-Based Algorithm for Anomaly Detection Using Computed Tomography Data","authors":"Hossam A. Gabbar, O. Adegboro, Abderrazak Chahid, Jing Ren","doi":"10.3390/computation11070139","DOIUrl":"https://doi.org/10.3390/computation11070139","url":null,"abstract":"In a nuclear power plant (NPP), the used tools are visually inspected to ensure their integrity before and after their use in the nuclear reactor. The manual inspection is usually performed by qualified technicians and takes a large amount of time (weeks up to months). In this work, we propose an automated tool inspection that uses a classification model for anomaly detection. The deep learning model classifies the computed tomography (CT) images as defective (with missing components) or defect-free. Moreover, the proposed algorithm enables incremental learning (IL) using a proposed thresholding technique to ensure a high prediction confidence by continuous online training of the deployed online anomaly detection model. The proposed algorithm is tested with existing state-of-the-art IL methods showing that it helps the model quickly learn the anomaly patterns. In addition, it enhances the classification model confidence while preserving a desired minimal performance.","PeriodicalId":10526,"journal":{"name":"Comput.","volume":"48 1","pages":"139"},"PeriodicalIF":0.0,"publicationDate":"2023-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77537152","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-07-10, DOI: 10.3390/computation11070138
G. Krivovichev, Elena S. Bezrukova
This paper is devoted to the comparison of discrete velocity models used for the simulation of compressible flows with arbitrary specific heat ratios in the lattice Boltzmann method. The stability of the governing equations is analyzed for the steady flow regime. A technique for the construction of stability domains in parametric space, based on the analysis of eigenvalues, is proposed, and a comparison of the stability domains of the different models is performed. It is demonstrated that the maximum value of the macrovelocity, which defines the onset of instability, depends on the value of the relaxation time, and plots of this dependence are constructed. For double-distribution-function models, it is demonstrated that the value of the Prandtl number does not seriously affect stability. An off-lattice parametric finite-difference scheme is proposed for the practical realization of the considered kinetic models. The Riemann problems and the problem of Kelvin–Helmholtz instability simulation are numerically solved, and it is demonstrated that the different models lead to close numerical results. The proposed technique of stability investigation can be used as an effective tool for the theoretical comparison of different kinetic models used in applications of the lattice Boltzmann method.
{"title":"Analysis of Discrete Velocity Models for Lattice Boltzmann Simulations of Compressible Flows at Arbitrary Specific Heat Ratio","authors":"G. Krivovichev, Elena S. Bezrukova","doi":"10.3390/computation11070138","DOIUrl":"https://doi.org/10.3390/computation11070138","url":null,"abstract":"This paper is devoted to the comparison of discrete velocity models used for simulation of compressible flows with arbitrary specific heat ratios in the lattice Boltzmann method. The stability of the governing equations is analyzed for the steady flow regime. A technique for the construction of stability domains in parametric space based on the analysis of eigenvalues is proposed. A comparison of stability domains for different models is performed. It is demonstrated that the maximum value of macrovelocity, which defines instability initiation, is dependent on the values of relaxation time, and plots of this dependence are constructed. For double-distribution-function models, it is demonstrated that the value of the Prantdl number does not seriously affect stability. The off-lattice parametric finite-difference scheme is proposed for the practical realization of the considered kinetic models. The Riemann problems and the problem of Kelvin–Helmholtz instability simulation are numerically solved. It is demonstrated that different models lead to close numerical results. The proposed technique of stability investigation can be used as an effective tool for the theoretical comparison of different kinetic models used in applications of the lattice Boltzmann method.","PeriodicalId":10526,"journal":{"name":"Comput.","volume":"361 1","pages":"138"},"PeriodicalIF":0.0,"publicationDate":"2023-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77432196","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-07-10, DOI: 10.3390/computation11070137
Mohamed Karim Hajji, Hatem Hadda, N. Dridi
This paper presents a comprehensive approach for minimizing the makespan in the challenging two-stage hybrid flow shop with dedicated machines, a problem known to be strongly NP-hard. The study proposes a constraint programming approach, a novel heuristic based on a priority rule, and Tabu search procedures to tackle this optimization problem. The constraint programming model, implemented using a commercial solver, serves as the exact resolution method, while the heuristic and Tabu search provide approximate solutions, offering a balance between accuracy and efficiency. The motivation behind this research is the need to address the complexity of scheduling in the two-stage hybrid flow shop with dedicated machines, which poses significant challenges due to its NP-hard nature and the need for efficient optimization techniques. The contribution of this study lies in the development of an integrated approach that combines constraint programming, a novel heuristic, and Tabu search into a comprehensive and efficient solution. To enhance the search process, effective elimination rules are introduced that reduce the search space and simplify the search effort, improving overall optimization performance and helping to find high-quality solutions. The results demonstrate the effectiveness of the proposed approach. The heuristic achieves complete success in solving all instances of specific classes, showcasing its practical applicability. Furthermore, the constraint programming model exhibits exceptional efficiency, successfully solving problems with up to n = 500 jobs. This efficiency is noteworthy compared with the instances solved by other exact solution approaches, indicating the scalability and effectiveness of the proposed method.
{"title":"Makespan Minimization for the Two-Stage Hybrid Flow Shop Problem with Dedicated Machines: A Comprehensive Study of Exact and Heuristic Approaches","authors":"Mohamed Karim Hajji, Hatem Hadda, N. Dridi","doi":"10.3390/computation11070137","DOIUrl":"https://doi.org/10.3390/computation11070137","url":null,"abstract":"This paper presents a comprehensive approach for minimizing makespan in the challenging two-stage hybrid flowshop with dedicated machines, a problem known to be strongly NP-hard. This study proposed a constraint programming approach, a novel heuristic based on a priority rule, and Tabu search procedures to tackle this optimization problem. The constraint programming model, implemented using a commercial solver, serves as the exact resolution method, while the heuristic and Tabu search explore approximate solutions simultaneously. The motivation behind this research is the need to address the complexities of scheduling problems in the context of two-stage hybrid flowshop with dedicated machines. This problem presents significant challenges due to its NP-hard nature and the need for efficient optimization techniques. The contribution of this study lies in the development of an integrated approach that combines constraint programming, a novel heuristic, and Tabu search to provide a comprehensive and efficient solution. The proposed constraint programming model offers exact resolution capabilities, while the heuristic and Tabu search provide approximate solutions, offering a balance between accuracy and efficiency. To enhance the search process, the research introduces effective elimination rules, which reduce the search space and simplify the search effort. This approach improves the overall optimization performance and contributes to finding high-quality solutions. The results demonstrate the effectiveness of the proposed approach. The heuristic approach achieves complete success in solving all instances for specific classes, showcasing its practical applicability. Furthermore, the constraint programming model exhibits exceptional efficiency, successfully solving problems with up to n=500 jobs. This efficiency is noteworthy compared to instances solved by other exact solution approaches, indicating the scalability and effectiveness of the proposed method.","PeriodicalId":10526,"journal":{"name":"Comput.","volume":"6 1","pages":"137"},"PeriodicalIF":0.0,"publicationDate":"2023-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86034098","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-07-09, DOI: 10.3390/computation11070136
Natalia Menshutina, Andrey Abramov, E. Mokhova
This paper presents modern methods of mathematical modeling that are widely used in the development of new inhalation and intranasal drugs, including those needed for the treatment of socially significant diseases such as tuberculosis, bronchial asthma, and mental and behavioral disorders. The review reveals that the mathematical modeling methods used in drug development are fragmented and that there is no single approach combining the existing methods. The results presented in this work should contribute to the development of a unified multiscale model as a new approach in mathematical modeling, supporting the accelerated development and market introduction of new drugs with high bioavailability and the required therapeutic efficacy.
{"title":"Mathematical and Computer Modeling as a Novel Approach for the Accelerated Development of New Inhalation and Intranasal Drug Delivery Systems","authors":"Natalia Menshutina, Andrey Abramov, E. Mokhova","doi":"10.3390/computation11070136","DOIUrl":"https://doi.org/10.3390/computation11070136","url":null,"abstract":"This paper presents modern methods of mathematical modeling, which are widely used in the development of new inhalation and intranasal drugs, including those necessary for the treatment of socially significant diseases, which include: tuberculosis, bronchial asthma, and mental and behavioral disorders. Based on the conducted studies, it was revealed that the methods of mathematical modeling used in the development of drugs are fragmented, and there is no single approach that would combine the existing methods. The results presented in the work should contribute to the development of a unified multiscale model as a new approach in mathematical modeling that contributes to the accelerated development and introduction to the market of new drugs with high bioavailability and the required therapeutic efficacy.","PeriodicalId":10526,"journal":{"name":"Comput.","volume":"162 1","pages":"136"},"PeriodicalIF":0.0,"publicationDate":"2023-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88188763","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-07-08, DOI: 10.3390/computers12070137
Saima Khosa, A. Mehmood, Muhammad Rizwan
The study focuses on news category prediction and investigates the performance of sentence embeddings from four transformer models (BERT, RoBERTa, MPNet, and T5) and their variants as feature vectors when combined with Softmax and Random Forest classifiers, using two publicly accessible news datasets from Kaggle. The data are stratified into train and test sets to ensure equal representation of each category. Word embeddings are generated with the transformer models, with the last hidden layer taken as the embedding; mean pooling then produces a single vector representation, the sentence embedding, capturing the overall meaning of the news article. The performance of Softmax and Random Forest individually, as well as the soft voting of both, is evaluated using accuracy, F1 score, precision, and recall, and the macro-average F1 score is used to compare the different transformer embeddings under the same experimental settings. The experiments reveal that MPNet versions v1 and v3 achieve the highest F1 score of 97.7% when combined with Random Forest, while the T5 Large embedding achieves the highest F1 score of 98.2% when used with Softmax regression. MPNet v1 performs exceptionally well in the voting classifier, obtaining an impressive F1 score of 98.6%. In conclusion, the experiments confirm the superiority of certain transformer models, such as MPNet v1, MPNet v3, and DistilRoBERTa, when used to compute sentence embeddings within the Random Forest framework, and they highlight the promising performance of T5 Large and RoBERTa Large in the soft voting of Softmax regression and Random Forest. The voting classifier, employing transformer embeddings and ensemble learning techniques, consistently outperforms the other baselines and individual algorithms. These findings emphasize the effectiveness of the voting classifier with transformer embeddings in achieving accurate and reliable predictions for news category classification tasks.
{"title":"Unifying Sentence Transformer Embedding and Softmax Voting Ensemble for Accurate News Category Prediction","authors":"Saima Khosa, A. Mehmood, Muhammad Rizwan","doi":"10.3390/computers12070137","DOIUrl":"https://doi.org/10.3390/computers12070137","url":null,"abstract":"The study focuses on news category prediction and investigates the performance of sentence embedding of four transformer models (BERT, RoBERTa, MPNet, and T5) and their variants as feature vectors when combined with Softmax and Random Forest using two accessible news datasets from Kaggle. The data are stratified into train and test sets to ensure equal representation of each category. Word embeddings are generated using transformer models, with the last hidden layer selected as the embedding. Mean pooling calculates a single vector representation called sentence embedding, capturing the overall meaning of the news article. The performance of Softmax and Random Forest, as well as the soft voting of both, is evaluated using evaluation measures such as accuracy, F1 score, precision, and recall. The study also contributes by evaluating the performance of Softmax and Random Forest individually. The macro-average F1 score is calculated to compare the performance of different transformer embeddings in the same experimental settings. The experiments reveal that MPNet versions v1 and v3 achieve the highest F1 score of 97.7% when combined with Random Forest, while T5 Large embedding achieves the highest F1 score of 98.2% when used with Softmax regression. MPNet v1 performs exceptionally well when used in the voting classifier, obtaining an impressive F1 score of 98.6%. In conclusion, the experiments validate the superiority of certain transformer models, such as MPNet v1, MPNet v3, and DistilRoBERTa, when used to calculate sentence embeddings within the Random Forest framework. The results also highlight the promising performance of T5 Large and RoBERTa Large in voting of Softmax regression and Random Forest. The voting classifier, employing transformer embeddings and ensemble learning techniques, consistently outperforms other baselines and individual algorithms. These findings emphasize the effectiveness of the voting classifier with transformer embeddings in achieving accurate and reliable predictions for news category classification tasks.","PeriodicalId":10526,"journal":{"name":"Comput.","volume":"17 1","pages":"137"},"PeriodicalIF":0.0,"publicationDate":"2023-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85638005","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-07-07, DOI: 10.3390/computation11070134
M. Sen
This paper investigates the asymptotic hyperstability of a single-input–single-output closed-loop system whose controlled plant is time-invariant and possesses a strongly strictly positive real transfer function subject to internal and external point delays. There are, in general, two controls involved: the internal one, which stabilizes the system with linear state feedback independently of the delay sizes, and the external one, which belongs to a hyperstable class and satisfies a Popov-type time-integral inequality. The class of hyperstable controllers under consideration combines, in general, a regular impulse-free part with an impulsive part.
{"title":"Hyperstability of Linear Feed-Forward Time-Invariant Systems Subject to Internal and External Point Delays and Impulsive Nonlinear Time-Varying Feedback Controls","authors":"M. Sen","doi":"10.3390/computation11070134","DOIUrl":"https://doi.org/10.3390/computation11070134","url":null,"abstract":"This paper investigates the asymptotic hyperstability of a single-input–single-output closed-loop system whose controlled plant is time-invariant and possesses a strongly strictly positive real transfer function that is subject to internal and external point delays. There are, in general, two controls involved, namely, the internal one that stabilizes the system with linear state feedback independent of the delay sizes and the external one that belongs to an hyperstable class and satisfies a Popov’s-type time integral inequality. Such a class of hyperstable controllers under consideration combines, in general, a regular impulse-free part with an impulsive part.","PeriodicalId":10526,"journal":{"name":"Comput.","volume":"1 1","pages":"134"},"PeriodicalIF":0.0,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78878142","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-07-07, DOI: 10.3390/computation11070133
Xiaoxiao Wang, Minda Zhao, Fanyu Meng, Xin Liu, Z. Kong, Xin Chen
Path-specific effect analysis is a powerful tool in causal inference. This paper provides a definition of a causal counterfactual path-specific importance score for the structural causal model (SCM). Unlike existing path-specific effect definitions, which focus on the population level, the score defined in this paper quantifies the impact of a decision variable on an outcome variable along a specific pathway at the individual level. Moreover, the score has many desirable properties, including following the chain rule and being consistent. Finally, this paper presents an algorithm that leverages these properties to effectively find the k most important paths, those with the highest importance scores, in a causal graph.
Title: Quantifying Causal Path-Specific Importance in Structural Causal Model
Pub Date: 2023-07-07, DOI: 10.3390/computation11070135
Yu. Manzhos, Yevheniia Sokolova
With the proliferation of Internet of Things (IoT) devices and cyber-physical systems, there is a growing demand for highly functional and high-quality software. To address this demand, it is crucial to employ effective software verification methods. The proposed method is based on the physical quantities defined by the International System of Units, which have specific physical dimensions, together with the transformation of physical value orientation introduced by Siano. To evaluate the effectiveness of the method, specialized software defect models have been developed, based on the statistical characteristics of the open-source C/C++ code used in drone applications. The advantages of the proposed method include early detection of software defects at compile-time, reduced testing duration, cost savings from identifying a significant portion of latent defects, improved software quality in terms of reliability, robustness, and performance, and complementing existing verification techniques by focusing on latent defects derived from software characteristics. By implementing this method, significant reductions in testing time and improvements in both reliability and software quality can be achieved. The method aims to detect 90% of incorrect uses of software variables and over 50% of incorrect uses of operations at both compile-time and run-time.
{"title":"A Software Verification Method for the Internet of Things and Cyber-Physical Systems","authors":"Yu. Manzhos, Yevheniia Sokolova","doi":"10.3390/computation11070135","DOIUrl":"https://doi.org/10.3390/computation11070135","url":null,"abstract":"With the proliferation of the Internet of Things devices and cyber-physical systems, there is a growing demand for highly functional and high-quality software. To address this demand, it is crucial to employ effective software verification methods. The proposed method is based on the use of physical quantities defined by the International System of Units, which have specific physical dimensions. Additionally, a transformation of the physical value orientation introduced by Siano is utilized. To evaluate the effectiveness of this method, specialized software defect models have been developed. These models are based on the statistical characteristics of the open-source C/C++ code used in drone applications. The advantages of the proposed method include early detection of software defects during compile-time, reduced testing duration, cost savings by identifying a significant portion of latent defects, improved software quality by enhancing reliability, robustness, and performance, as well as complementing existing verification techniques by focusing on latent defects based on software characteristics. By implementing this method, significant reductions in testing time and improvements in both reliability and software quality can be achieved. The method aims to detect 90% of incorrect uses of software variables and over 50% of incorrect uses of operations at both compile-time and run-time.","PeriodicalId":10526,"journal":{"name":"Comput.","volume":"137 1","pages":"135"},"PeriodicalIF":0.0,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79746161","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-07-06, DOI: 10.3390/computers12070136
Yousra Odeh, Nedhal Al Saiyd
The prioritization of software requirements is necessary for successful software development, and a use case is a useful way to represent and prioritize user-centric requirements. Use-case-based prioritization ranks use cases according to the business value they deliver, based on identified criteria. The research community has started applying use case modeling to emerging technologies such as the IoT, mobile development, and big data. A systematic literature review was conducted to understand the approaches reported in the last two decades. For each of the 40 identified approaches, a review is presented with respect to the consideration of scenarios, the extent of formality, and the size of requirements. Only 32.5% of the reviewed studies considered scenario-based approaches, and the majority of the reported approaches were developed semiformally (53.8%). The reported results open prospects for the development of new approaches that fill the gap regarding the inclusion of strategic goals and the respective business processes that support scenario representation. This study reveals that existing approaches fail to consider necessary criteria such as risks, goals, and some quality-related requirements. The findings reported herein are useful for researchers and practitioners aiming to improve current prioritization practices using the use case approach.
{"title":"Prioritizing Use Cases: A Systematic Literature Review","authors":"Yousra Odeh, Nedhal Al Saiyd","doi":"10.3390/computers12070136","DOIUrl":"https://doi.org/10.3390/computers12070136","url":null,"abstract":"The prioritization of software requirements is necessary for successful software development. A use case is a useful approach to represent and prioritize user-centric requirements. Use-case-based prioritization is used to rank use cases to attain a business value based on identified criteria. The research community has started engaging use case modeling for emerging technologies such as the IoT, mobile development, and big data. A systematic literature review was conducted to understand the approaches reported in the last two decades. For each of the 40 identified approaches, a review is presented with respect to consideration of scenarios, the extent of formality, and the size of requirements. Only 32.5% of the reviewed studies considered scenario-based approaches, and the majority of reported approaches were semiformally developed (53.8%). The reported result opens prospects for the development of new approaches to fill a gap regarding the inclusive of strategic goals and respective business processes that support scenario representation. This study reveals that existing approaches fail to consider necessary criteria such as risks, goals, and some quality-related requirements. The findings reported herein are useful for researchers and practitioners aiming to improve current prioritization practices using the use case approach.","PeriodicalId":10526,"journal":{"name":"Comput.","volume":"49 1","pages":"136"},"PeriodicalIF":0.0,"publicationDate":"2023-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82923733","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}