A slicing method for object-oriented programs using lightweight dynamic information
Pub Date: 2001-12-04 | DOI: 10.1109/APSEC.2001.991488
Fumiaki Ohata, K. Hirose, M. Fujii, Katsuro Inoue
Program slicing has been used to support efficient program debugging. A program slice is computed by analyzing dependence relations between program statements. Dependence analyses fall into two categories, static and dynamic: static analysis is cheap but yields large slices, whereas dynamic analysis is expensive but yields small slices. In this paper, we propose a program slicing method for object-oriented programs and evaluate its effectiveness on Java programs. Because object-oriented languages contain many dynamically determined elements, static analysis alone cannot compute practically useful results. Our method combines static and dynamic analyses appropriately and computes accurate slices at low cost.
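As a minimal illustration of the static/dynamic trade-off (not the authors' algorithm), consider the toy Java program below: a static slice on the printed value must keep both branches of the conditional, whereas a dynamic slice for one concrete input keeps only the branch that actually executed.

```java
// Toy program for illustrating static vs. dynamic slicing.
// Slicing criterion: the value of z printed on the last line.
public class SliceDemo {
    public static void main(String[] args) {
        int x = Integer.parseInt(args[0]); // in both slices: z depends on x
        int y = 10;                        // in the static slice (the else branch
                                           // defines z from y), but not in the
                                           // dynamic slice for input x = 5
        int z;
        if (x > 0) {
            z = x * 2;                     // executed for x = 5: in that dynamic slice
        } else {
            z = y + 1;                     // not executed for x = 5: excluded from that
                                           // dynamic slice, but kept by a static slice
        }
        System.out.println(z);             // slicing criterion
    }
}
```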
{"title":"A slicing method for object-oriented programs using lightweight dynamic information","authors":"Fumiaki Ohata, K. Hirose, M. Fujii, Katsuro Inoue","doi":"10.1109/APSEC.2001.991488","DOIUrl":"https://doi.org/10.1109/APSEC.2001.991488","url":null,"abstract":"Program slicing has been used for efficient program debugging activities. A program slice is computed by analyzing dependence relations between program statements. We can divide dependence analyses into two categories, static and dynamic; the former requires small analysis costs, but the resulting slices are large, and in the latter the cost is high but the slices are small. In this paper, we propose a program slicing method for object-oriented programs and evaluate its effectiveness with Java programs. Since object-oriented languages have many dynamically determined elements, static analysis could not compute practical analysis results. Our method uses static and dynamic analyses appropriately and computes accurate slices with small costs.","PeriodicalId":130293,"journal":{"name":"Proceedings Eighth Asia-Pacific Software Engineering Conference","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132314075","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A requirements description metamodel for use cases
Pub Date: 2001-12-04 | DOI: 10.1109/APSEC.2001.991485
Takako Nakatani, Tetsuya Urai, Sou Ohmura, T. Tamai
Engineers have little time for requirements elicitation and validation, because they must still spend great effort writing down concrete use cases. Although concrete use cases are important for deriving test cases, it is possible to free engineers from the routine work of repeatedly defining similar use cases while keeping the elicited requirements consistent. We propose a solution to these difficulties: a requirements description metamodel, the RD-metamodel, which integrates the activity graph metamodel and the use case metamodel. It supplies a mechanism for writing use cases from multiple perspectives: resource reference, resource structure, activity sequence, process, and actor.
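The integration the abstract describes could look roughly like the Java sketch below, which ties a use case's steps to activities, resources, and an actor; all class and field names here are illustrative assumptions, not the RD-metamodel's actual elements.

```java
// Hypothetical sketch linking a use case metamodel to an activity graph metamodel.
import java.util.ArrayList;
import java.util.List;

class Actor {
    final String name;
    Actor(String name) { this.name = name; }
}

class Activity {
    final String description;
    final List<String> referencedResources = new ArrayList<>(); // resource-reference view
    Activity(String description) { this.description = description; }
}

class UseCase {
    final String title;
    final Actor primaryActor;                       // actor perspective
    final List<Activity> steps = new ArrayList<>(); // activity-sequence perspective
    UseCase(String title, Actor primaryActor) {
        this.title = title;
        this.primaryActor = primaryActor;
    }
}

public class MetamodelSketch {
    public static void main(String[] args) {
        UseCase uc = new UseCase("Withdraw cash", new Actor("Customer"));
        Activity insertCard = new Activity("Insert card");
        insertCard.referencedResources.add("Account");
        uc.steps.add(insertCard);
        System.out.println(uc.title + " has " + uc.steps.size() + " step(s)");
    }
}
```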
{"title":"A requirements description metamodel for use cases","authors":"Takako Nakatani, Tetsuya Urai, Sou Ohmura, T. Tamai","doi":"10.1109/APSEC.2001.991485","DOIUrl":"https://doi.org/10.1109/APSEC.2001.991485","url":null,"abstract":"Engineers have little time for requirements elicitation and their validation, because they still have to make a great effort to write down concrete use cases. Although concrete use cases are important for deriving test cases, it is possible to free engineers from the routine work of defining similar use cases repeatedly and at the same time keeping consistency in requirements elicitation. We propose one solution concerning these difficulties. The requirements description metamodel called RD-metamodel integrates the activity graph metamodel and use case metamodel. It supplies a mechanism of use case writing with multiple perspectives: resource-reference, resource-structure, activity-sequence, process, and the actor's perspective.","PeriodicalId":130293,"journal":{"name":"Proceedings Eighth Asia-Pacific Software Engineering Conference","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127237197","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Automatic translation of a timed process algebra into dynamic state graphs
Pub Date: 2001-12-04 | DOI: 10.1109/APSEC.2001.991460
J. J. Pardo, V. V. Ruiz, F. Cuartero, D. Cazorla
In this paper we consider a timed process algebra based on classical LOTOS operators for specifying the behaviour of concurrent systems, in particular systems for which time is an important factor, such as real-time systems. One of the main goals of this paper is to define a translation into a kind of dynamic state graph; the translation is supported by a tool (TPAL) that simulates the execution of a specification by means of these dynamic state graphs.
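A dynamic state graph of the kind mentioned here might be represented and stepped through roughly as in the Java sketch below; the representation and the step rule are assumptions for illustration and are not the ones used by TPAL.

```java
// Hypothetical dynamic state graph: states connected by transitions labelled with
// an action and a time bound, plus a tiny deterministic simulation loop.
import java.util.ArrayList;
import java.util.List;

class Transition {
    final String action;
    final int delay;        // time units consumed when the transition fires
    final State target;
    Transition(String action, int delay, State target) {
        this.action = action; this.delay = delay; this.target = target;
    }
}

class State {
    final String name;
    final List<Transition> out = new ArrayList<>();
    State(String name) { this.name = name; }
}

public class StateGraphSim {
    public static void main(String[] args) {
        State idle = new State("idle"), busy = new State("busy");
        idle.out.add(new Transition("request", 0, busy));
        busy.out.add(new Transition("respond", 5, idle));

        // Simulate a few steps, always taking the first enabled transition.
        State current = idle;
        int clock = 0;
        for (int i = 0; i < 4 && !current.out.isEmpty(); i++) {
            Transition t = current.out.get(0);
            clock += t.delay;
            System.out.println("t=" + clock + ": " + t.action + " -> " + t.target.name);
            current = t.target;
        }
    }
}
```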
{"title":"Automatic translation of a timed process algebra into dynamic state graphs","authors":"J. J. Pardo, V. V. Ruiz, F. Cuartero, D. Cazorla","doi":"10.1109/APSEC.2001.991460","DOIUrl":"https://doi.org/10.1109/APSEC.2001.991460","url":null,"abstract":"In this paper we consider a timed process algebra based on classical LOTOS operators in order to specify the behaviour of concurrent systems and, concretely, those systems for which time becomes an important factor to take into account, such as real-time systems. One of the main goals of this paper is to define a translation into a kind of dynamic state graph, which is currently supported by a tool (TPAL), which allows it to simulate the execution of a specification by means of these dynamic state graphs.","PeriodicalId":130293,"journal":{"name":"Proceedings Eighth Asia-Pacific Software Engineering Conference","volume":"134 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114210556","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cost analysis of games, using program logic
Pub Date: 2001-12-04 | DOI: 10.1109/APSEC.2001.991501
Carroll Morgan, Annabelle McIver
Summary form only given. Recent work in probabilistic programming semantics has provided a relatively simple probabilistic extension to predicate transformers, making it possible to treat small imperative probabilistic programs containing both demonic and angelic nondeterminism. That work has in turn been extended to provide a probabilistic basis for the modal μ-calculus of Kozen (1983), leading to a quantitative μ-calculus. The standard (non-probabilistic) μ-calculus can be interpreted either 'normally', over its semantic domain, or as a two-player game between an 'angel' and a 'demon' representing the two forms of choice; Stirling (1995) has argued that the two interpretations correspond. The quantitative μ-calculus too can be interpreted both ways, the novel interpretation being the second: a probabilistic game involving an angel and a demon. Each player seeks a strategy to maximise (resp. minimise) the game's 'outcome', with the steps in the game now being stochastic. That suggests a connection with Markov decision processes (MDPs), in which players compete for high (resp. low) 'rewards' over a Markov transition system. In this paper we explore that connection, showing how, for example, discounted MDPs and terminating MDPs can be written as quantitative μ-formulae. The 'normal' interpretation of those formulae (i.e. over the semantic domain) then seems to give much more direct access to existence theorems than the presentation usually associated with MDPs. Our technical contribution is to explain the coding of MDPs as quantitative μ-formulae, to discuss the extension of the latter to incorporate 'rewards', and to illustrate the resulting reformulation of several existence theorems. In an appendix we give an algebraic characterisation of the new quantitative-with-reward form of the calculus.
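For intuition, the connection described here can be suggested by the familiar fixed-point form of the discounted-MDP value function; this is only a sketch of the flavour of the encoding, not the paper's quantitative μ-calculus formulation, and the symbols r, P, γ, and V* are the usual MDP ingredients assumed here for illustration.

```latex
% Bellman fixed-point equation for a discounted MDP: the optimal value V* is the
% unique fixed point of the maximising operator below (unique because the operator
% is a contraction when 0 <= gamma < 1), i.e. exactly the kind of quantity that a
% least-fixed-point (mu) formula can denote.
\[
  V^{*}(s) \;=\; \max_{a \in A}\Bigl( r(s,a) \;+\; \gamma \sum_{s'} P(s' \mid s,a)\, V^{*}(s') \Bigr),
  \qquad 0 \le \gamma < 1 .
\]
```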
{"title":"Cost analysis of games, using program logic","authors":"Carroll Morgan, Annabelle McIver","doi":"10.1109/APSEC.2001.991501","DOIUrl":"https://doi.org/10.1109/APSEC.2001.991501","url":null,"abstract":"Summary form only given. Recent work in probabilistic programming semantics has provided a relatively simple probabilistic extension to predicate transformers, making it possible to treat small imperative probabilistic programs containing both demonic and angelic nondeterminism. That work in turn has extended to provide a probabilistic basis for the modal /spl mu/-calculus of Kozen (1983), and leads to a quantitative /spl mu/-calculus. Standard (non-probabilistic) /spl mu/-calculus can be interpreted either 'normally', over its semantic domain, or as a two-player game between an 'angel' and a 'demon' representing the two forms of choice. Stirling (1995) has argued that the two interpretations correspond. Quantitative p-calculus too can be interpreted both ways, with the novel interpretation being the second: a probabilistic game involving an angel and a demon. Each player seeks a strategy to maximise (resp. minimise) the game's 'outcome', with the steps in the game now being stochastic. That suggests a connection with Markov decision processes, in which players compete for high (resp. low) 'rewards' over a Markov transition system. In this paper we explore that connection, showing how for example discounted Markov decision processes (MDP's) and terminating MDP's can be written as quantitative p-formulae. The 'normal' interpretation of those formulae (i.e. over the semantic domain) then seems to give a much more direct access to existence theorems than the presentation usually associated with MDP's. Our technical contribution is to explain the coding of MDP's as quantitative p-formulae, to discuss the extension of the latte in incorporate 'rewards', and to illustrate the resulting reformulation of several existence theorems. In an appendix we give an algebraic characterisation of the new quantitative-with-reward form of the calculus.","PeriodicalId":130293,"journal":{"name":"Proceedings Eighth Asia-Pacific Software Engineering Conference","volume":"407 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130705113","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Trading-assisting service discovery architecture
Pub Date: 2001-12-04 | DOI: 10.1109/APSEC.2001.991468
T. Senivongse, Worawut Suphasanthitikul
A trading object, or trader, provides the ability to discover instances of services offered within a distributed system. This paper proposes a service discovery service (SDS) for a CORBA environment that can assist client providers in constructing their client programs. Using the SDS, client providers obtain knowledge of services with the required functionality before writing client programs that trade for those services with the trader. The SDS gathers XML service descriptions from multiple CORBA traders and can discover information on service types, service interface definitions, and service offers. It can be thought of as a search engine for service descriptions, queryable via keywords and XML query languages. It is accessible by both CORBA and WWW clients, with the possibility of enlarging the search space through a federation of multiple SDSs.
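From a client provider's point of view, using such a service might look roughly like the Java sketch below; the interface, record type, and method names are assumptions for illustration only and are not the SDS's or the CORBA trader's actual API.

```java
// Hypothetical client-side view of an SDS-style keyword search over service descriptions.
import java.util.List;

/** Hypothetical search facade over gathered XML service descriptions. */
interface ServiceDiscoveryService {
    /** Return descriptions of service offers whose XML description matches the keywords. */
    List<ServiceDescription> searchByKeywords(List<String> keywords);
}

/** Minimal stand-in for an XML service description gathered from a trader. */
record ServiceDescription(String serviceType, String interfaceName, String offerId) {}

public class SdsClientExample {
    public static void main(String[] args) {
        // Stub SDS; a real one would query the gathered XML descriptions.
        ServiceDiscoveryService sds = keywords ->
                List.of(new ServiceDescription("Printer", "PrinterService", "offer-42"));

        // Discover candidate services before writing the actual trading code.
        for (ServiceDescription d : sds.searchByKeywords(List.of("printer", "postscript"))) {
            System.out.println(d.serviceType() + " offer " + d.offerId());
        }
    }
}
```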
{"title":"Trading-assisting service discovery architecture","authors":"T. Senivongse, Worawut Suphasanthitikul","doi":"10.1109/APSEC.2001.991468","DOIUrl":"https://doi.org/10.1109/APSEC.2001.991468","url":null,"abstract":"A trading object or trader provides the ability to discover instances of services offered within a distributed system. This paper proposes a service discovery service (SDS) for a CORBA environment that can assist client providers in the construction of their client programs. By using the SDS, the client providers will obtain knowledge of services with required functionality before writing the client programs to trade for those services with the trader. The SDS gathers XML service descriptions from multiple CORBA traders and can discover information on service types, service interface definitions, and service offers. It can be thought of as a search engine for service descriptions and query is possible via keywords and XML query languages. It is also accessible by both CORBA and WWW clients with the possibility of enlarging search space by a federation of multiple SDS.","PeriodicalId":130293,"journal":{"name":"Proceedings Eighth Asia-Pacific Software Engineering Conference","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121155473","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Architectural design for evolution by analyzing requirements on quality attributes
Pub Date: 2001-12-04 | DOI: 10.1109/APSEC.2001.991466
T. Kishi, N. Noda, T. Katayama
It is important to design an architecture that remains stable throughout evolution, or that serves as the platform for a product family. To design a software architecture with such characteristics, we must analyze the commonality and differences among the requirements of the potential software expected to be developed on the architecture, and then design the architecture to accommodate that commonality and those differences. In this paper, we propose an approach to architectural design in which we analyze the requirements of potential software in terms of their impact on the architecture, considering multiple quality attributes. We perform a case study on an actual project that designed the architecture of an on-board system for intelligent transportation systems (ITS) to examine the usefulness of the technique. We also apply the technique to the same architectural design problem to demonstrate that it is applicable to a real problem.
{"title":"Architectural design for evolution by analyzing requirements on quality attributes","authors":"T. Kishi, N. Noda, T. Katayama","doi":"10.1109/APSEC.2001.991466","DOIUrl":"https://doi.org/10.1109/APSEC.2001.991466","url":null,"abstract":"It is important to design architecture to be steady throughout the evolution, or make the architecture to be the platform for a product family. In order to design software architecture to have such characteristics we have to analyze the commonality and differences among requirements on potential software that are supposed to be developed on the architecture, and then design software architecture so as to accommodate the commonality and differences. In this paper, we propose an approach for architectural design in which we analyze the requirements of potential software in terms of the impact on the architecture, considering multiple quality attributes. We perform a case study on an actual project that designed architecture for an on-board system for ITS systems to examine the usefulness of the technique. We also apply the technique to the same architectural design problem to demonstrate that it is applicable to the real problem.","PeriodicalId":130293,"journal":{"name":"Proceedings Eighth Asia-Pacific Software Engineering Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128561469","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A rigorous method for testing real-time reactive systems
Pub Date: 2001-12-04 | DOI: 10.1109/APSEC.2001.991479
V. Alagar, M. Zheng
Real-time reactive systems are complex systems to design and verify. Rigorous testing of real-time reactive systems complements the more difficult and expensive formal verification process. This paper discusses a rigorous method for black-box testing of real-time reactive systems whose design specifications are given in the timed reactive object model (TROM) formalism.
{"title":"A rigorous method for testing real-time reactive systems","authors":"V. Alagar, M. Zheng","doi":"10.1109/APSEC.2001.991479","DOIUrl":"https://doi.org/10.1109/APSEC.2001.991479","url":null,"abstract":"Real-time reactive systems are complex systems to design and verify. Rigorous testing of real-time reactive systems complement the more difficult and expensive formal verification process. This paper discusses a rigorous method for block-box testing of real-time reactive systems, whose design specifications are given in the timed reactive object model (TROM) formalism.","PeriodicalId":130293,"journal":{"name":"Proceedings Eighth Asia-Pacific Software Engineering Conference","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124671089","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Component metrics to measure component quality
Pub Date: 2001-12-04 | DOI: 10.1109/APSEC.2001.991509
Eunsook Cho, Min Sun Kim, Soo Dong Kim
Recently, component-based software development (CBSD) has been gaining acceptance in industry as an effective new software development paradigm. Since the introduction of component-based software engineering (CBSE) in the late 1990s, CBSD research has focused largely on component modeling, methodology, architecture, and component platforms. However, as the number of components available on the market increases, it becomes more important to devise metrics that quantify the various characteristics of components. In this paper, we propose metrics for measuring the complexity, customizability, and reusability of software components. The complexity metric can be used to evaluate the complexity of a component. Customizability measures how efficiently and widely a component can be customized for organization-specific requirements. Reusability measures the degree to which component features are reused in building applications. We expect that these metrics can be used effectively to quantify the characteristics of components.
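To make the general idea concrete, a customizability-style measure could be as simple as the ratio sketched below in Java; this is an assumed illustration of the flavour of such metrics, not the definitions proposed in the paper.

```java
// Illustrative sketch: a component's customizability as the fraction of its
// configurable elements among all of its elements (an assumed example only).
public class CustomizabilityMetric {

    /** Ratio of customizable properties/operations to all properties/operations. */
    static double customizability(int customizableElements, int totalElements) {
        if (totalElements == 0) {
            return 0.0;                  // avoid division by zero for empty components
        }
        return (double) customizableElements / totalElements;
    }

    public static void main(String[] args) {
        // A component exposing 20 elements, 8 of which can be customized.
        System.out.printf("customizability = %.2f%n", customizability(8, 20));
    }
}
```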
{"title":"Component metrics to measure component quality","authors":"Eunsook Cho, Min Sun Kim, Soo Dong Kim","doi":"10.1109/APSEC.2001.991509","DOIUrl":"https://doi.org/10.1109/APSEC.2001.991509","url":null,"abstract":"Recently, component-based software development is getting accepted in industry as a new effective software development paradigm. Since the introduction of component-based software engineering (CBSE) in later 90's, the CBSD research has focused largely on component modeling, methodology, architecture and component platform. However, as the number of components available on the market increases, it becomes more important to devise metrics to quantify the various characteristics of components. In this paper, we propose metrics for measuring the complexity, customizability, and reusability of software components. Complexity metric can be used to evaluate the complexity of components. Customizability is used to measure how efficiently and widely the components can be customized for organization specific requirement. Reusability can be used to measure the degree of features that are reused in building applications. We expect that these metrics can be effectively used to quantify the characteristics of components.","PeriodicalId":130293,"journal":{"name":"Proceedings Eighth Asia-Pacific Software Engineering Conference","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128295388","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Improving the testing process by program static analysis
Pub Date: 2001-12-04 | DOI: 10.1109/APSEC.2001.991477
Nahomi Kikuchi, T. Kikuno
This paper describes a test process improvement aimed at improving software quality in a large organization with a large number of software projects. First, we identified the activities in the organization's testing process and analyzed their characteristics. As a result, we found that dynamic tests were performed well while static tests were performed less often. An improvement plan was requested that would contribute to product quality without increasing the projects' development effort. We then decided on a plan to introduce static analysis tools and to establish a testing process in which static analysis tools are applied as much as possible. Implementation of the improvement plan consists of two steps: introductory application and complete application of the tools across the organization. The characteristics of this process improvement are that the tools were evaluated and confirmed in pilot projects before their actual introduction, and that the procedures for applying the tools to projects were designed carefully. The effectiveness of this approach was confirmed in an analysis of the projects that applied it, in which statically detectable problems were removed successfully before system test.
{"title":"Improving the testing process by program static analysis","authors":"Nahomi Kikuchi, T. Kikuno","doi":"10.1109/APSEC.2001.991477","DOIUrl":"https://doi.org/10.1109/APSEC.2001.991477","url":null,"abstract":"This paper describes a test process improvement aiming to improve software quality in a large organization that has a large number of software projects. First, we identified activities in the testing process in the organization and analyzed their characteristics. As a result, we identified that dynamic tests have been performed well and static tests have been less performed. Improvement plan was requested that contributes to the product quality without increasing development efforts for the projects. We then decided a plan to introduce static analysis tools and establish the testing process in which static analysis tools are applied as much as possible. Implementation of the improvement plan consists of two steps: introductory and complete application of tools to the organization. The characteristics of this process improvement are not only that tools have been evaluated and confirmed in pilot projects before actual introduction but also procedures have been designed carefully for the application of the tools to projects. The effectiveness of this approach was confirmed in the analysis of applied projects in which static problems were removed successfully before system test.","PeriodicalId":130293,"journal":{"name":"Proceedings Eighth Asia-Pacific Software Engineering Conference","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125356718","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A formal framework to build software agents
Pub Date: 2001-12-04 | DOI: 10.1109/APSEC.2001.991467
M. Patra, H. Mohanty
This paper presents a micro-level design perspective of a software agent that is capable of taking part in service-centric interactions with other agents in its environment. The agent software is not built as a monolithic entity but is engineered as a set of distinct modules. The components of the agent software are formally specified using the RAISE specification language. Our formal approach to agent building provides clarity at the conceptual level and eases the process of implementation.
{"title":"A formal framework to build software agents","authors":"M. Patra, H. Mohanty","doi":"10.1109/APSEC.2001.991467","DOIUrl":"https://doi.org/10.1109/APSEC.2001.991467","url":null,"abstract":"This paper presents a micro-level design perspective of a software agent that is capable of taking part in service-centric interactions with other agents in its environment. The agent software is not built as a monolithic entity but is engineered as a set of distinct modules. The components of the agent software are formally specified using the RAISE specification language. Our formal approach to agent building provides clarity at the conceptual level and eases the process of implementation.","PeriodicalId":130293,"journal":{"name":"Proceedings Eighth Asia-Pacific Software Engineering Conference","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117092397","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}