Use case scenarios are a powerful means for requirements specification. On the one hand, they bring together in the same modelling space the expectations of the stakeholders and the needs of the developers involved in the process. On the other hand, they describe the desired high-level functionalities. By formalizing these descriptions we are able to extract relevant information from them. Specifically, we are interested in identifying requirements patterns (common requirements with typical implementation solutions) in support of a requirements-based software development approach. This paper addresses the transformation of use case descriptions expressed in a Controlled Natural Language into an ontology expressed in the Web Ontology Language (OWL), as well as the process of querying that information. It reports on a study aimed at validating our approach and our tool with real users. A preliminary set of results is discussed.
"A Study on the Viability of Formalizing Use Cases" by Rui Couto, A. Ribeiro, J. C. Campos. In: 2014 9th International Conference on the Quality of Information and Communications Technology, September 2014. DOI: 10.1109/QUATIC.2014.23.
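The pipeline this abstract describes — use case steps formalized into an ontology and then queried for requirements patterns — can be illustrated with a minimal, library-free sketch. The step phrasing, triple vocabulary and query below are invented for illustration; they are not the authors' actual CNL grammar or OWL encoding.

```python
# Minimal sketch: controlled-language use case steps parsed into
# subject/verb/object triples, stored in a tiny in-memory "ontology",
# then queried for a requirements-pattern-style question.
# The vocabulary and step format are illustrative assumptions only.

def parse_step(step):
    """Parse a controlled 'Actor verb Object' sentence into a triple."""
    actor, verb, obj = step.split(" ", 2)
    return (actor, verb, obj)

def query(triples, verb=None, obj=None):
    """Return actors whose steps match the given verb/object pattern."""
    return [s for (s, v, o) in triples
            if (verb is None or v == verb) and (obj is None or o == obj)]

steps = [
    "Customer submits order",
    "System validates order",
    "System notifies customer",
]
kb = [parse_step(s) for s in steps]

# Which entities act upon 'order'? (a pattern-identification query)
print(query(kb, obj="order"))  # ['Customer', 'System']
```

In the paper's setting the knowledge base would be OWL and the queries would run against the ontology; the sketch only shows why a triple-structured formalization makes such questions mechanically answerable.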
The topic of alignment between Business and Information Systems has been for some time, and still remains, a top concern of research in diverse areas. It offers many open avenues for research, even if the different views and subjects involved lead to considerable dispersion and fuzziness. On the path to building a conceptual, structured framework that incorporates the related topics of requirements engineering, enterprise architecture and strategy alignment, our literature search uncovered three main themes: the business model artifact, strategy and goal modelling, and enterprise modelling. Next, we laid out the research problem and its related research questions, whose answers will give rise to artifacts, through design science (working on existing knowledge) and action research (working on live projects). The initial steps of our incremental research approach, with a perspective on the requirements engineering and business model topics, and the development of a new method around an existing solution for the generation of logical architectures, have been well received by the research community. Current and future work in our plan will deepen and extend this research, further grounding the business model artifact and the strategy and goal modelling issues, while approaching enterprise modelling, framework structuring and tool support.
"An OMG-based Meta-Framework for Alignment of IS/IT Architecture with Business Models" by Carlos E. Salgado, R. J. Machado, R. Maciel. In: 2014 9th International Conference on the Quality of Information and Communications Technology, September 2014. DOI: 10.1109/QUATIC.2014.46.
Evidence-based approaches are expected to play an important role in increasing the quality awareness of our community as a whole, by raising evidence on what works, when and where. One of their most important instruments is the systematic literature review (SLR), a secondary-study technique derived from medical practice, which seeks to obtain accurate data by analyzing primary studies, eliminating possible biases those studies may suffer from. Mapping studies and quasi-systematic literature reviews are kinds of SLR aimed at identifying gaps in the corresponding set of primary studies, where further primary studies are required, as well as clusters that can be the target of more focused SLRs. However, we cannot forget that the building blocks of evidence-based approaches are primary studies, which can range from observational studies to controlled experiments, applying either quantitative or qualitative techniques.
"Foreword of the Thematic Track: Evidence-Based Software Quality Engineering" by G. Travassos, Fernando Brito e Abreu. In: 2014 9th International Conference on the Quality of Information and Communications Technology, September 2014. DOI: 10.1109/QUATIC.2014.59.
The quantitative assessment of quality attributes on software architectures makes it possible to support early decisions in the design phase, certify quality requirements established by stakeholders and improve software quality in future architectural changes. In the literature, only a few of these quality requirements are verified, and most often they are checked manually, which is time-consuming and error-prone due to the overwhelming complexity of the designs. The goal of this thesis is to provide means for architects to predict and analyze availability constraints on software architectures. We plan to generate a stochastic model from an architectural description specified in an Architecture Description Language (ADL), properly annotated to be solved by a probabilistic model-checking tool. This model will allow us to quantitatively predict availability and identify bottlenecks that negatively influence the overall system availability. Hence, our approach will help architects avoid undesired or infeasible architectural designs and prevent the extra costs of fixing problems detected late in the life cycle.
"Availability Evaluation of Software Architectures through Formal Methods" by J. M. Franco, R. Barbosa, M. Z. Rela. In: 2014 9th International Conference on the Quality of Information and Communications Technology, September 2014. DOI: 10.1109/QUATIC.2014.45.
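As a back-of-the-envelope illustration of the kind of prediction this thesis targets, steady-state availability of a series/parallel architecture can be composed from per-component availabilities. The components and figures below are hypothetical; the actual approach derives a stochastic model from an annotated ADL description and solves it with a probabilistic model checker.

```python
# Sketch: steady-state availability of a small hypothetical architecture.
# Series composition: all components must be up (product of availabilities).
# Parallel (replicated) composition: at least one replica must be up.

def series(*avail):
    out = 1.0
    for a in avail:
        out *= a
    return out

def parallel(*avail):
    out = 1.0
    for a in avail:
        out *= (1.0 - a)  # probability that every replica is down
    return 1.0 - out

# Hypothetical components: web front-end (two replicas), app server, database.
web = parallel(0.99, 0.99)          # 1 - 0.01 * 0.01 = 0.9999
system = series(web, 0.999, 0.995)  # web, app server, database in series

print(round(system, 6))  # 0.993906
# The database (0.995) dominates: improving it raises system availability
# more than adding a third web replica would -- a bottleneck in the
# sense the abstract describes.
```

A probabilistic model checker would additionally handle repair rates, state-dependent failures and transient behaviour, which this closed-form sketch cannot express.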
The World Wide Web has become a major delivery platform for a variety of complex and sophisticated enterprise applications in several domains. In addition to their inherent multifaceted functionality, Web applications exhibit complex behaviour and place some unique requirements on their ubiquitous usability, performance, security and ability to grow and evolve. Web development can benefit from established practices in other related disciplines, but it has some distinguishing characteristics that are mainly due to its multidisciplinary nature, encompassing contributions from diverse areas. In addition, due to the advent of new technologies, novel development practices have become possible, for example based on the paradigm of Web Mashups, which also offer the possibility of defining advanced user interface mechanisms that enhance the user experience.
"Foreword of the Thematic Track: Quality in Web Engineering" by M. Matera. In: 2014 9th International Conference on the Quality of Information and Communications Technology, September 2014. DOI: 10.1109/QUATIC.2014.61.
Among the different kinds of incident processing centers, the service desk has stood out, with an increasing number of adopters around the world. Nonetheless, even though it offers a series of desirable features and resources, a service desk needs several procedures to survive, without which it can suffer severely from inadequate or nonexistent IT service management. To address this need, the framework of IT best practices known as ITIL has been widely implemented in such scenarios. However, its adoption by service desk operators has not always been as smooth as it should be. The main reason becomes evident when one considers how hard it is to comply with the many recommendations laid out in the framework. Additionally, the technical staff's attention to, and worry about, complying with the framework is so great that the quality of incident treatment can be impaired. It is exactly for this challenge that the persuasive technique named gamification arises here as a proposal to turn the hard, tense or tedious tasks commonly performed on service desks into more interesting and engaging daily activities. Through game design and game elements, gamification can improve service desk operators' motivation and engagement, making them more involved in their job and consequently improving the use of, and adherence to, the ITIL-recommended IT best practices. In this scenario, the more the IT best practices are adequately followed and the more incident treatment quality increases, the more IT service management quality should improve.
"Toward a Gamification Model to Improve IT Service Management Quality on Service Desk" by Fabio Silva da Conceicao, Alan Silva, A. O. Filho, Reinaldo Cabral Silva Filho. In: 2014 9th International Conference on the Quality of Information and Communications Technology, September 2014. DOI: 10.1109/QUATIC.2014.41.
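A minimal sketch of the kind of game mechanics the abstract points to: points and levels awarded for ITIL-aligned incident handling. The rule names, point values and level thresholds below are invented for illustration and are not taken from the proposed model.

```python
# Sketch: award points to a service desk operator for ITIL-aligned
# actions on an incident ticket; levels reward sustained adherence.
# All rule names, point values and thresholds are hypothetical.

POINTS = {
    "categorized_correctly": 10,
    "resolved_within_sla": 25,
    "knowledge_base_updated": 15,
}

def score(actions):
    """Total points for the ITIL-aligned actions performed on a ticket."""
    return sum(POINTS.get(a, 0) for a in actions)

def level(total):
    """Map accumulated points to a named level (illustrative thresholds)."""
    if total >= 100:
        return "gold"
    if total >= 50:
        return "silver"
    return "bronze"

ticket = ["categorized_correctly", "resolved_within_sla",
          "knowledge_base_updated"]
print(score(ticket), level(score(ticket)))  # 50 silver
```

The design intent is that compliance with the framework becomes a visible, rewarded behaviour rather than an extra burden competing with incident treatment.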
Context: There is an increasing demand for mobile BIS apps and shorter time-to-market requirements. However, developing those apps faces several problems, such as guaranteeing the fulfilment of business rules, supporting multiple platforms, handling localization and facilitating app evolution. Objective: Propose a generative approach for mobile BIS apps that mitigates the identified problems. Its input is a platform-independent model (PIM), with business rules specified in OCL. Method: We adopted the Design Science Research methodology, which helps in gaining problem understanding, identifying systemically appropriate solutions, and effectively evaluating new and innovative solutions. Results: We have identified the problem and its motivation, defined the objectives for a solution, designed and developed a prototype generative tool for BIS apps, demonstrated its usage, evaluated how well it mitigates a subset of the identified problems in an observational study, and started to communicate its effectiveness to researchers and practitioners. Limitations: Several issues have not been addressed yet, such as the problem of distributed business rule enforcement and the formalization of the required transformations from the PIM to several platform-specific models (PSMs). Conclusion: We intend to contribute to reducing the time-to-market of BIS apps, while improving their maintainability.
"A MDE Generative Approach for Mobile Business Apps" by Luis Alexandre Ferreira da Silva, Fernando Brito e Abreu. In: 2014 9th International Conference on the Quality of Information and Communications Technology, September 2014. DOI: 10.1109/QUATIC.2014.50.
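The role of OCL business rules in the PIM can be illustrated with a small sketch: an OCL-style invariant (here, "an order's total must equal the sum of its lines") checked over model instances in Python. The classes and the invariant are hypothetical, not taken from the authors' tool, and a real generator would emit the check into each platform-specific app rather than evaluate it by hand.

```python
# Sketch: enforcing an OCL-like invariant over PIM instances.
# Illustrative OCL:
#   context Order inv: self.total = self.lines->collect(amount)->sum()

from dataclasses import dataclass, field

@dataclass
class Line:
    amount: float

@dataclass
class Order:
    lines: list = field(default_factory=list)
    total: float = 0.0

def check_order_total(order):
    """Return True when the invariant holds for this Order instance."""
    return order.total == sum(line.amount for line in order.lines)

ok = Order(lines=[Line(10.0), Line(5.0)], total=15.0)
bad = Order(lines=[Line(10.0)], total=99.0)
print(check_order_total(ok), check_order_total(bad))  # True False
```

Specifying such rules once at the PIM level is what lets a generative approach keep them consistent across every generated platform-specific variant.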
Technological development and innovation are key drivers for software development organizations. Furthermore, they are strategic for the growth of a region or a country. Therefore, the Brazilian government established a public policy instrument to identify and stimulate software resulting from technological development and innovation in the country. To this end, a software assessment methodology named CERTICS has been created and established in Brazil. Its construction has been based on the reality of software development organizations and guided by methodological references including the ISO/IEC 15504 (SPICE) standard. CERTICS includes a reference model, an assessment method and an arrangement for its application, monitoring and continuous improvement. A software organization can also use CERTICS as a reference model of good practices in technological development and innovation. This article presents an overview of the rationale, design, major components and early practical results of CERTICS Methodology version 1.1.
"CERTICS Assessment Methodology for Software Technological Development and Innovation" by A. M. Alves, Clenio F. Salviano, G. Stefanuto, Sonia T. Maintinguer, Carolina V. Mattos, Camila Zeitoum, Márcia Regina Martins Martinez, Giancarlo Reuss. In: 2014 9th International Conference on the Quality of Information and Communications Technology, September 2014. DOI: 10.1109/QUATIC.2014.32.
Many nuclear instrumentation and control (I&C) systems are designed using a function block diagram description of the system. Strict requirements pertain to the verification of these systems, and different verification techniques, including structure-based testing, are demanded by standards and regulators. Unfortunately, the traditional structure-based test techniques intended for software code are not directly applicable to function block diagrams. However, coverage criteria for function block diagrams have recently been developed. In this work we have used these coverage criteria and developed a technique for generating structure-based test sets for function block based designs. The test set is generated automatically, but the technique requires that a model of the system suitable for model checking is available. The technique utilises model checking to determine the concrete test cases. We have also described how tests can be generated so that multiple test requirements are fulfilled at once, thus decreasing the number of generated test cases. We have implemented our approach as a proof-of-concept tool, and demonstrated the technique on a case study system.
"Automatic Test Set Generation for Function Block Based Systems Using Model Checking" by Jussi Lahtinen. In: 2014 9th International Conference on the Quality of Information and Communications Technology, September 2014. DOI: 10.1109/QUATIC.2014.15.
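The core idea behind this kind of approach — ask a model checker for a counterexample to a "trap property" claiming a coverage obligation is unreachable, then read the counterexample back as a concrete test case — can be sketched without a real model checker by brute-forcing a tiny function block. The block logic and the obligations below are illustrative assumptions, not the paper's coverage criteria.

```python
# Sketch of counterexample-based test generation for a toy function
# block: for each coverage obligation, search for an input vector that
# satisfies it. A model checker would return such a vector as a
# counterexample to the obligation's negation; here exhaustive search
# over the Boolean inputs plays that role.

from itertools import product

def block(a, b, c):
    """Toy function block logic: OUT = (a AND b) OR c."""
    return (a and b) or c

# Illustrative coverage obligations: drive the output to each value.
obligations = {
    "out_true":  lambda a, b, c: block(a, b, c) is True,
    "out_false": lambda a, b, c: block(a, b, c) is False,
}

def generate_tests(obligations):
    """Find one witness input vector (a concrete test case) per obligation."""
    tests = {}
    for name, holds in obligations.items():
        for inputs in product([False, True], repeat=3):
            if holds(*inputs):
                tests[name] = inputs  # first witness found
                break
    return tests

print(generate_tests(obligations))
# one witness per obligation, e.g. out_true -> (False, False, True)
```

The paper's multi-requirement optimisation corresponds to picking witnesses that satisfy several obligations simultaneously, which shrinks the generated test set.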
An interlocking system monitors the status of the objects in a railway yard, allowing or denying the movement of trains in accordance with safety rules. These rules depend on the topology of the station, and hence every single delivered system obeys a particular set of rules. On the other hand, being safety-critical systems, interlocking systems are subject to expensive certification processes. Part of these costs is due to the fact that testing has to be repeated for each delivered product; moreover, due to the complexity of such topologies, the test suites may be very large, and different for each product. In this paper we show how the problem has been addressed at the final validation stage of production interlocking systems, by extracting a model of the implemented interlocking logic from the on-target description of the topology. This model is exercised with the planned test suite. Since simulation appears to be more than an order of magnitude faster than testing on the target, early discovery of bugs in the description of the rules, or of inaccuracies in the test suite, can spare hours of rework on the target.
"Validation of Interlocking Systems by Testing their Models" by A. Bonacchi, A. Fantechi. In: 2014 9th International Conference on the Quality of Information and Communications Technology, September 2014. DOI: 10.1109/QUATIC.2014.37.
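A toy version of the kind of rule an interlocking enforces — a route may be set only when all of its track sections are free and no conflicting route is already locked — shows why the logic is topology-dependent. The yard layout and route names below are invented for illustration, not the paper's extracted model.

```python
# Sketch: a toy interlocking rule check. A route can be locked only if
# every track section it uses is unoccupied and no already-locked route
# shares a section with it. Layout and route names are hypothetical.

ROUTES = {
    "R1": {"T1", "T2"},
    "R2": {"T2", "T3"},   # conflicts with R1 on section T2
    "R3": {"T4"},
}

def can_lock(route, occupied, locked):
    """Decide whether `route` may be locked given yard state."""
    sections = ROUTES[route]
    if sections & occupied:          # a train occupies part of the route
        return False
    for other in locked:             # a conflicting route is already locked
        if ROUTES[other] & sections:
            return False
    return True

locked = {"R1"}
print(can_lock("R2", occupied=set(), locked=locked))   # False (shares T2)
print(can_lock("R3", occupied=set(), locked=locked))   # True
print(can_lock("R3", occupied={"T4"}, locked=locked))  # False (occupied)
```

Because ROUTES differs per station, every delivered system embeds a different rule set — which is exactly why the paper extracts the model from the on-target topology description instead of re-deriving it by hand.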