Model-driven development (MDD) corresponds to the building of models and their transformation into intermediate models and code. Modeling components and compositions is a natural consequence of MDD. In this paper we show the advantages of using an executable modeling language together with a Java library that pre-implements the execution semantics of this language. The proposed executable language is based on UML state machine diagrams. The semantic variation points linked to these diagrams lead us to manage equivalent variations in the Java implementation of components. The paper offers a comprehensive component design method based on a tailor-made UML profile whose role is to control the semantic variation points in models.
{"title":"Component Design based on Model Executability","authors":"F. Barbier, Eric Cariou","doi":"10.1109/SEAA.2008.16","DOIUrl":"https://doi.org/10.1109/SEAA.2008.16","url":null,"abstract":"Model-driven development (MDD) corresponds to the building of models and their transformation into intermediate models and code. Modeling components and compositions is a natural consequence of MDD. We show in this paper the advantages of using an executable modeling language associated with a Java library which pre-implements the execution semantics of this language. The proposed executable language is based on UML state machine diagrams. The semantic variation points linked to these diagrams lead us to manage equivalent variations in the Java implementation of components. The paper offers a comprehensive component design method based on a tailor-made UML profile whose role is the control of the semantic variation points in models.","PeriodicalId":127633,"journal":{"name":"2008 34th Euromicro Conference Software Engineering and Advanced Applications","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130520925","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The objective of the work presented in this paper is to design and develop a framework for the simulation of requirements engineering processes. The framework is intended to support the construction of simulation models by guiding the modeler in which components to use in such models and by speeding up the development of simulation models. The framework was developed through an iterative process in which it was used to model two processes that had previously been represented in simulation models. The resulting framework consists of three layers: one representing general process modeling concepts, one representing software process concepts, and one representing requirements engineering concepts.
{"title":"A Framework for Simulation of Requirements Engineering Processes","authors":"Martin Höst, B. Regnell, Christofer Tingström","doi":"10.1109/SEAA.2008.26","DOIUrl":"https://doi.org/10.1109/SEAA.2008.26","url":null,"abstract":"The objective of the work presented in this paper is to design and develop a framework for simulation of requirements engineering processes. The framework is intended to be a support when simulation models are built by guiding the modeler in which components to use in this type of models and to speed up the process of developing simulation models. The framework was developed through an iterative process where it was used to model two processes that previously have been represented in simulation models. The resulting framework consists of three layers; one representing general process modeling concepts, one representing software process concepts, and one representing requirements engineering concepts.","PeriodicalId":127633,"journal":{"name":"2008 34th Euromicro Conference Software Engineering and Advanced Applications","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115643285","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Early software size estimation is essential for good project management. Although several proposals to estimate software size from requirements specifications exist, most of them have not been properly defined or automated. This paper presents the design and automation of a measurement procedure (ReqPoints) to estimate the size of object-oriented software projects from a requirements specification. The procedure is based on a requirements engineering approach that provides an MDA framework for requirements specification and model transformations to obtain the architecture of UML models. Specifically, a set of measurement rules is defined as a mapping from the concepts of the Requirements Metamodel onto the concepts of the Function Point Analysis (FPA) Metamodel. A Requirements Estimation Tool (REST) was built to automate the measurement process. We demonstrate the feasibility of the approach by applying the estimation tool to a case study.
{"title":"A Metamodeling Approach to Estimate Software Size from Requirements Specifications","authors":"S. Abrahão, E. Insfrán","doi":"10.1109/SEAA.2008.53","DOIUrl":"https://doi.org/10.1109/SEAA.2008.53","url":null,"abstract":"Early software size estimation is essential for good project management. Although several proposals to estimate software size from requirement specifications exist, most of them have not been properly defined or automated. This paper presents the design and automation of a measurement procedure (ReqPoints) to estimate the size of object-oriented software projects from a requirements specification. The procedure is based on a requirements engineering approach that provides a MDA framework for requirements specification and model transformations to obtain the architecture of UML models. Specifically, a set of measurement rules is defined as a mapping between the concepts of the Requirements Metamodel onto the concepts of the Function Point Analysis (FPA) Metamodel. A Requirements Estimation Tool (REST) was built to automate the measurement process. We demonstrate the feasibility of applying the estimation tool to a case study.","PeriodicalId":127633,"journal":{"name":"2008 34th Euromicro Conference Software Engineering and Advanced Applications","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115537884","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A crucial issue in software cost estimation that has attracted the interest of software project managers is the selection of the best prediction method for estimating the cost of a project. Most prediction techniques estimate the cost from historical data. The selection of the best model is based on accuracy measures that are functions of the predictive error, whereas the significance of the differences can be evaluated through statistical procedures. However, statistical tests cannot easily be applied by non-experts, and their results can be difficult to interpret. The purpose of this paper is to introduce a visualization tool, regression error characteristic (REC) curves, that allows different prediction models to be compared easily by a simple inspection of a graph. Moreover, these curves are adjusted to accuracy measures that appear in the software cost estimation literature, and the experimentation is based on two well-known datasets.
{"title":"Comparing Software Cost Prediction Models by a Visualization Tool","authors":"N. Mittas, L. Angelis","doi":"10.1109/SEAA.2008.23","DOIUrl":"https://doi.org/10.1109/SEAA.2008.23","url":null,"abstract":"A crucial issue in the software cost estimation area that has attracted the interest of software project managers is the selection of the best prediction method for estimating the cost of a project. Most of the prediction techniques estimate the cost from historical data. The selection of the best model is based on accuracy measures that are functions of the predictive error, whereas the significance of the differences can be evaluated through statistical procedures. However, statistical tests cannot be applied easily by non-experts while there are difficulties in the interpretation of their results. The purpose of this paper is to introduce the utilization of a visualization tool, the regression error characteristic curves in order to compare different prediction models easily, by a simple inspection of a graph. Moreover, these curves are adjusted to accuracy measures appeared in software cost estimation literature and the experimentation is based on two well-known datasets.","PeriodicalId":127633,"journal":{"name":"2008 34th Euromicro Conference Software Engineering and Advanced Applications","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114785734","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper presents a technical vision for future individual traffic. It deals with two different objectives: passenger cars and motorcycles as battery-driven electric vehicles (EVs), and traffic congestion avoidance. Against the technical background of our own work, we explain how the power supply for recharging the batteries will have to be organized in a distributed fashion, in particular under the assumption that the power is provided by renewable sources such as wind turbines and solar panels (which are themselves widely dispersed). We argue that while the unpredictability of local or regional customers already creates major problems for network stability in traditional power grid management (and thus for providing the reserve energy needed), these problems will be greatly amplified by introducing EVs on a large scale and by integrating renewable energy into existing power management. In our DEZENT project we have defined and broadly pursued a distributed bottom-up approach for negotiating demand and supply under such circumstances, in an architecture where demand and supply are negotiated by software agents within 0.5 s intervals while grid stability is guaranteed at the same time. Since EVs themselves constitute relevant sources of reserve energy once they appear in large numbers, they could be a core instrument for minimizing the stability problem. Under this innovative perspective we also discuss a novel distributed algorithm (BeeJamA), based on swarm intelligence, in which EVs receive driving directions in due time, in a highly dynamic way, before reaching each road intersection. In the absence of global information, congestion is avoided and at the same time the travel times of all drivers are "homogenized". Combined with the transition to EV traffic, we could also expect a very substantial reduction in pollution and thus, altogether, enormous ecological and economic progress.
{"title":"Distributed Embedded Real-Time Systems and Beyond: A Vision of Future Road Vehicle Management","authors":"H. Wedde, S. Lehnhoff, C. Rehtanz, O. Krause","doi":"10.1109/SEAA.2008.56","DOIUrl":"https://doi.org/10.1109/SEAA.2008.56","url":null,"abstract":"This paper presents a technical vision for future individual traffic. It deals with two different objectives: passenger cars or motorcycles as battery-driven electric vehicles (EVs) and traffic congestion avoidance. On the technical background of our own work we will explain how power supply for recharging the batteries will have to be organized in a distributed fashion, in particular under the assumption that the power is provided through renewable sources such as from wind turbines and solar panels (which are widely dispersed themselves). We will argue that while the unpredictability of local or regional customers in traditional power grid management creates already major problems for network stability (thus for providing the reserve energy needed) these will be greatly amplified by introducing EVs on a large scale, and by integrating renewable energy into the existing power management. In our DEZENT project we have defined and broadly pursued a distributed bottom-up approach for negotiating demand and supply under such circumstances, in an adequate architecture where demand and supply will be negotiated by software agents within 0.5 sec intervals while at the same time the grid stability is guaranteed. Since EVs themselves constitute relevant sources of reserve energy when coming up in large numbers they could be a core instrument for minimizing the stability problem.- Under this innovative perspective we will also discuss a novel distributed algorithm (BeeJamA) based on Swarm Intelligence where the EVs receive directions in due time, in a highly dynamic way before reaching each road intersection. In the absence of global information congestions are avoided and at the same time the travel times of all drivers are \"homogenized\". Combined with the transition into EV traffic we also could expect a very substantial reduction of pollution thus altogether an enormous ecological and economic progress.","PeriodicalId":127633,"journal":{"name":"2008 34th Euromicro Conference Software Engineering and Advanced Applications","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126733651","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
There is growing interest in techniques and tools that facilitate the testing of mobile systems. The movement of nodes is one of the relevant factors of context change in ubiquitous systems and a key challenge in the validation of context-aware applications. An approach is proposed to generate a testbed for service-oriented systems that takes into account a mobility model of the nodes of the network in which the accessed services are deployed. This testbed allows a tester to assess off-line the QoS properties of a service under test by considering possible variations in the responses of the interacting services due to node mobility.
{"title":"Let The Puppets Move! Automated Testbed Generation for Service-oriented Mobile Applications","authors":"A. Bertolino, G. D. Angelis, F. Lonetti, A. Sabetta","doi":"10.1109/SEAA.2008.33","DOIUrl":"https://doi.org/10.1109/SEAA.2008.33","url":null,"abstract":"There is a growing interest for techniques and tools facilitating the testing of mobile systems. The movement of nodes is one of the relevant factors of context change in ubiquitous systems and a key challenge in the validation of context-aware applications. An approach is proposed to generate a testbed for service-oriented systems that takes into account a mobility model of the nodes of the network in which the accessed services are deployed. This testbed allows a tester to assess off-line the QoS properties of a service under test, by considering possible variations in the response of the interacting services due to node mobility.","PeriodicalId":127633,"journal":{"name":"2008 34th Euromicro Conference Software Engineering and Advanced Applications","volume":"655 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123050370","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Novel forms of collaboration increasingly distribute control among e-workers, thereby allowing agile and autonomous collaboration. However, this requires a novel blend of infrastructure and algorithms for the self-adaptation of collaboration services. We present VieCAR (Vienna Collaborative Activity and Resource Management Framework), a framework that addresses the requirements of new collaborative service-oriented environments. Self-adaptive collaboration services depend on the user's context. VieCAR combines service-oriented architectures with activity-centric computing, enabling people to interact and collaborate regardless of their location and across organizational boundaries. Based on VieCAR's activity model, we present a ranking algorithm that determines the relevant input for service adaptation.
{"title":"VieCAR - Enabling Self-adaptive Collaboration Services","authors":"D. Schall, C. Dorn, S. Dustdar, Ignazio Dadduzio","doi":"10.1109/SEAA.2008.25","DOIUrl":"https://doi.org/10.1109/SEAA.2008.25","url":null,"abstract":"Novel forms of collaboration increasingly distribute control among e-workers, thereby allowing agile and autonomous collaboration. However, this requires a novel blend of infrastructure and algorithms for self-adaptation of collaboration services. We present VieCAR (Vienna Collaborative Activity and Resource Management Framework), a framework that addresses the requirements of new collaborative service-oriented environments. Self-adaptive collaboration services depend on the user¿s context. VieCAR combines service-oriented architectures with activity-centric computing enabling people to interact and collaborate regardless of their location and across organizational boundaries. Based on VieCAR¿s activity model, we present a ranking algorithm determining the relevant input for service adaptation.","PeriodicalId":127633,"journal":{"name":"2008 34th Euromicro Conference Software Engineering and Advanced Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130584784","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Optimizing IT investments in alignment with key business drivers is a challenge for all organizations. Organizations need to prioritize and select a portfolio of IT projects that provides optimal benefit and balances utility, risk, and resources. This paper reports on an exploratory empirical study of the processes and project-selection criteria used by several large companies. The objective of this research is to identify successful practices as well as common challenges that companies face in performing IT portfolio management. The empirical findings provide insight into the current state of practice and common challenges, and they form the basis for guidelines for implementing and improving IT portfolio management processes.
{"title":"An Empirical Study into the State of Practice and Challenges in IT Project Portfolio Management","authors":"Egon Gleisberg, Hendrik Zondag, M. Chaudron","doi":"10.1109/SEAA.2008.45","DOIUrl":"https://doi.org/10.1109/SEAA.2008.45","url":null,"abstract":"Optimizing IT investments in alignment with key business drivers is a challenge for all organizations. Organizations need to prioritize and select a portfolio of IT projects that provides optimal benefit and balances utility, risk, and resources. This paper reports on an exploratory empirical study into the processes and project-selection criteria used by several large companies. The objective of this research is to identify successful practices as well as some common challenges that companies have in performing IT portfolio management. These empirical findings provide insight into the current state of practice and common challenges. These form the basis for guidelines for implementing and improving IT portfolio processes.","PeriodicalId":127633,"journal":{"name":"2008 34th Euromicro Conference Software Engineering and Advanced Applications","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132937234","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Defect causal analysis (DCA) is a means of product-focused software process improvement. A systematic literature review was undertaken to identify the DCA state of the art. The systematic review gathered unbiased knowledge and evidence and identified opportunities for further investigation. Moreover, some guidance on how to implement DCA efficiently in software organizations could be elaborated. This paper describes the initial concept of the DBPI (Defect Based Process Improvement) approach, a DCA-based approach for process improvement designed in light of the results of the systematic review and the obtained guidance. Its main contributions are tailoring support for DCA-based process improvement and addressing an identified opportunity for further investigation by integrating organizational learning mechanisms regarding cause-effect relations into the conduct of DCA.
{"title":"Towards a Defect Prevention Based Process Improvement Approach","authors":"Marcos Kalinowski, G. Travassos, D. Card","doi":"10.1109/SEAA.2008.47","DOIUrl":"https://doi.org/10.1109/SEAA.2008.47","url":null,"abstract":"Defect causal analysis (DCA) is a means of product focused software process improvement. A systematic literature review to identify the DCA state of the art has been undertaken. The systematic review gathered unbiased knowledge and evidence and identified opportunities for further investigation. Moreover, some guidance on how to efficiently implement DCA in software organizations could be elaborated. This paper describes the initial concept of the DBPI (Defect Based Process Improvement) approach. It represents a DCA based approach for process improvement, designed considering the results of the systematic review and the obtained guidance. Its main contributions are tailoring support for DCA based process improvement and addressing an identified opportunity for further investigation by integrating organizational learning mechanisms regarding cause-effect relations into the conduct of DCA.","PeriodicalId":127633,"journal":{"name":"2008 34th Euromicro Conference Software Engineering and Advanced Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130709434","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Wireless Sensor Networks (WSN) are an important technological support for ambient assisted applications. Up to now, most WSN applications have been based on ad hoc solutions, and attempts to provide reusable applications are still in their infancy. In line with this research trend we propose a layered architecture called SAIL (Sensor Abstraction and Integration Layers) to be used in context-aware architectures and aimed at integrating WSN as context information sources. In the proposed approach, the applications running on a WSN can be exposed using either a node-centric or a data-centric paradigm, and they are interfaced with different access technologies. SAIL is currently encapsulated within the OSGi framework and has been tested on MICAz motes.
{"title":"SAIL: A Sensor Abstraction and Integration Layer for Context Awareness","authors":"M. Girolami, Stefano Lenzi, Francesco Furfari, S. Chessa","doi":"10.1109/SEAA.2008.30","DOIUrl":"https://doi.org/10.1109/SEAA.2008.30","url":null,"abstract":"Wireless Sensor Networks (WSN) are an important technological support for ambient assisted applications. Up to now most WSN applications are based on ad hoc solutions, and attempts to provide reusable applications are still in their youth. Under this trend of research we propose a layered architecture called SAIL (Sensor Abstraction and Integration Layers) to be used in context aware architectures, and aimed at integrating WSN as context information sources. In the proposed approach, the applications running on WSN can be exposed using either a node centric or a data centric paradigm, and they are interfaced with different access technologies. SAIL is currently encapsulated within the OSGi framework and has been tested on MICAz motes.","PeriodicalId":127633,"journal":{"name":"2008 34th Euromicro Conference Software Engineering and Advanced Applications","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125264018","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}