In peer-to-peer (P2P) overlay networks, a group of peers has to cooperate with one another. P2P systems are by nature scalable distributed systems with no centralized coordinator, so it is difficult, perhaps impossible, for each peer to communicate with every other peer. An acquaintance peer of a peer is another peer with which the peer can directly communicate. Each peer obtains access and location information on resources such as databases by communicating with its acquaintance peers. It is therefore critical to discuss how a peer can trust an acquaintance peer in P2P networks, since acquaintance peers may hold obsolete information or may be faulty. In this paper, we discuss subjective and objective types of trustworthiness of a peer in an acquaintance peer. A peer obtains the subjective trustworthiness of an acquaintance peer by communicating with it directly, whereas it obtains the objective trustworthiness of a target acquaintance peer by collecting the subjective trustworthiness that other peers hold on that target. The trustworthiness values are given in terms of fuzzy logic. There are cases where the subjective and objective trustworthiness of an acquaintance peer differ, i.e., other peers hold different opinions on the target acquaintance peer. A peer decides which type of trustworthiness to adopt based on its confidence, which expresses how strongly the peer believes its own opinion, i.e., its subjective trustworthiness of the acquaintance peer. If the peer is confident of its opinion, i.e., the confidence is large, it adopts the subjective trustworthiness of the target acquaintance peer; otherwise, it adopts the objective trustworthiness.
{"title":"Trustworthiness-based Group Communication Protocols","authors":"K. Kouno, A. Aikebaier, T. Enokido, M. Takizawa","doi":"10.1109/NBiS.2014.52","DOIUrl":"https://doi.org/10.1109/NBiS.2014.52","url":null,"abstract":"In peer-to-peer (P2P) overlay networks, a group of multiple peers have to cooperate with each other. P2P systems are in nature scalable distributed systems, where there is no centralized coordinator. It is difficult, maybe impossible for each peer to communicate with every other peer in P2P overlay networks. An acquaintance peer of a peer is another peer with which the peer can directly communicate. Each peer has to obtain access and location information on resources like databases through communicating with acquaintance peers. It is critical to discuss how each peer can trust an acquaintance peer in P2P networks since acquaintance peers may have obsolete information and may be faulty. In this paper, we discuss subjective and objective types of trustworthiness of a peer on an acquaintance peer. A peer obtains the subjective trustworthiness on an acquaintance peer through directly communicating with the acquaintance peer. On the other hand, a peer obtains the objective trustworthiness on a target acquaintance peer through collecting subjective trustworthiness of other peers on the target acquaintance peer. In this paper, the trustworthiness is given based on the Fuzzy logics. There are cases the subjective and objective types of trustworthiness on an acquaintance peer are different. That is, other peers have different trustworthiness opinions on the target acquaintance peer. A peer decides on which type of trustworthiness to be taken based on the confidence. The confidence of a peer shows how much the peer is confident of its own trustworthiness opinion, i.e. subjective trustworthiness on the acquaintance peer. If a peer is confident of the trustworthiness opinion, i.e. the confidence is larger, the peer takes the subjective trustworthiness on the target acquaintance peer. Otherwise, the peer takes the objective trustworthiness.","PeriodicalId":158978,"journal":{"name":"2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125566584","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Reasoning is one of the essential application areas of the modern Semantic Web. Semantic reasoning algorithms currently face significant challenges when dealing with the emergence of Internet-scale knowledge bases comprising extremely large amounts of data. Traditional reasoning approaches have only been proven for small, closed, trustworthy, consistent, coherent, and static data domains; as such, they are not well suited to data-intensive applications aiming at the Internet scale. We introduce the Large Knowledge Collider, a platform solution that leverages the service-oriented approach to implement a new reasoning technique that is capable of dealing with the exploding volumes of the rapidly growing data universe and can take advantage of large-scale, on-demand elastic infrastructures such as high-performance computing or cloud technology.
{"title":"Where SOA Meets the Semantic Reasoning","authors":"A. Cheptsov, S. Wesner","doi":"10.1109/CISIS.2012.103","DOIUrl":"https://doi.org/10.1109/CISIS.2012.103","url":null,"abstract":"Reasoning is one of the essential application areas of the modern Semantic Web. Nowadays, the semantic reasoning algorithms are facing significant challenges when dealing with the emergence of the Internet-scale knowledge bases, comprising extremely large amounts of data. The traditional reasoning approaches have only been approved for small, closed, trustworthy, consistent, coherent and static data domains. As such, they are not well-suited to be applied in data-intensive applications aiming on the Internet scale. We introduce the Large Knowledge Collider as a platform solution that leverages the service-oriented approach to implement a new reasoning technique, capable of dealing with exploding volumes of the rapidly growing data universe, in order to be able to take advantages of the large-scale and on-demand elastic infrastructures such as high performance computing or cloud technology.","PeriodicalId":158978,"journal":{"name":"2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115432400","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper presents an innovative Wireless Sensor Network (WSN) architecture based on a flexible Quality of Service (QoS) approach for road traffic management. The aim of this work is to analyse an algorithm that dynamically enables or disables cameras according to the actual need to monitor a given area, based on measured traffic volume. The approach has been developed to manage both network topology and workload conditions, using a fuzzy logic controller for flexible QoS management. Its performance has been evaluated using TrueTime and Simulink/Matlab.
{"title":"A Novel Road Monitoring Approach Using Wireless Sensor Networks","authors":"M. Collotta, G. Pau, V. M. Salerno, G. Scatà","doi":"10.1109/CISIS.2012.37","DOIUrl":"https://doi.org/10.1109/CISIS.2012.37","url":null,"abstract":"This paper shows an innovative Wireless Sensor Network (WSN) architecture based on a flexible Quality of Service (QoS) approach for road traffic management. The aim of this work is to analyse an algorithm that dynamically enables/disables some cameras according to the real need to monitor a given area, It is based on traffic volume measured values. This approach has been developed in order to manage both network topology and workload conditions, using a fuzzy logic controller for a flexible QoS management. Performances of this approach have been evaluated using True Time and Simulink/Matlab.","PeriodicalId":158978,"journal":{"name":"2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems","volume":"66 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116627539","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In the so-called 'New Culture for Assessment', assessment has become a tool for learning. Assessment is no longer considered isolated from the learning process; it is provided in embedded assessment forms. Moreover, students take more responsibility in the learning process in general and in assessment activities in particular. They become more engaged in developing assessment criteria, participating in self- and peer-assessment, reflecting on their own learning, monitoring their performance, and utilizing feedback to adapt their knowledge, skills, and behavior. Consequently, assessment tools have evolved from stand-alone, monolithic systems through modular assessment tools to a more flexible and interoperable generation that adopts service-oriented architecture and modern learning specifications and standards. This new generation holds great promise for interoperable learning services and tools within more personalized and adaptive e-learning platforms. In this paper, integrated automated assessment forms provided through flexible, SOA-based tools are discussed. Moreover, the paper presents a showcase of how these forms have been integrated with a Complex Learning Resource (CLR) and used for self-directed learning. The results of the study show that the developed tool for self-directed learning supports students in their learning process.
{"title":"Assessment for/as Learning: Integrated Automatic Assessment in Complex Learning Resources for Self-Directed Learning","authors":"M. Al-Smadi, G. Wesiak, C. Gütl, Andreas Holzinger","doi":"10.1109/CISIS.2012.210","DOIUrl":"https://doi.org/10.1109/CISIS.2012.210","url":null,"abstract":"In the so-called 'New Culture for Assessment' assessment has become a tool for Learning. Assessment is no more considered to be isolated from the learning process and provided as embedded assessment forms. Nevertheless, students have more responsibility in the learning process in general and in assessment activities in particular. They become more engaged in: developing assessment criteria, participating in self, peer-assessments, reflecting on their own learning, monitoring their performance, and utilizing feedback to adapt their knowledge, skills, and behavior. Consequently, assessment tools have emerged from being stand-alone represented by monolithic systems through modular assessment tools to more flexible and interoperable generation by adopting the service-oriented architecture and modern learning specifications and standards. The new generation holds great promise when it comes to having interoperable learning services and tools within more personalized and adaptive e-learning platforms. In this paper, integrated automated assessment forms provided through flexible and SOA-based tools are discussed. Moreover, it presents a show case of how these forms have been integrated with a Complex Learning Resource (CLR) and used for self-directed learning. The results of the study show, that the developed tool for self-directed learning supports students in their learning process.","PeriodicalId":158978,"journal":{"name":"2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127437341","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Highways tend to get congested because of the increase in the number of cars travelling on them. There are two solutions to this. The first, which is also expensive, consists in building new highways to carry the traffic. A much cheaper alternative is the introduction of advanced ICT software control systems that support the traffic and increase the efficiency of existing highways. ILRSH is such a software control system. It is designed to assist and automate the use of a highway lane as a reserved lane. The idea is to allow and support drivers to travel at high speed if, in return, they are willing to pay a small fee to reserve an empty virtual slot on the reserved lane. The slot is valid for a portion of the highway and a time window, so each driver pays a fee that depends on their travelling needs. In return, drivers are guaranteed congestion-free travel on that portion. In this paper we present the proposed architecture of the ILRSH and its subsystems. The system is based on several proposed algorithms, designed from real-world driving observations, that assist drivers entering or exiting the reserved lane. We present simulation results showing the feasibility of the proposed approach and the resulting increase in traffic efficiency.
{"title":"ILRSH: Intelligent Lane Reservation System for Highway(s)","authors":"C. Dobre, V. Cristea, L. Iftode","doi":"10.1109/CISIS.2012.111","DOIUrl":"https://doi.org/10.1109/CISIS.2012.111","url":null,"abstract":"Highways tend to get congested because of the increase in the number of cars travelling on them. There are two solutions to this. The first one, which is also expensive, consists in building new highways to support the traffic. A much cheaper alternative consists in the introduction of advanced ITC software control systems to support the traffic and increase the efficiency of the already existing highways. ILRSH is such a software control system. It is designed to assist and automate the use of a highway lane as a reserved lane. The idea is to allow and support drivers to travel at a high speed, if in return they are willing to pay a small fee to reserve an empty virtual slot on the reserved lane. This slot is valid for a portion and of the highway and a time window, so each driver pays the fee depending on its travelling needs. In return, drivers are guaranteed a congestion free travel on that portion. In this paper we present the proposed architecture of the ILRSH and its subsystems. The system is based on several proposed algorithms designed to assist the drivers enter or exit the reserved lane, based on real-world driving observations. We present simulation results showing the feasibility of the proposed approach, and the increase in traffic efficiency.","PeriodicalId":158978,"journal":{"name":"2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems","volume":"140 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125828260","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Positive expiratory pressure (PEP) therapy is an effective method for removing the mucus that builds up in the lungs of sufferers of chronic lung diseases such as cystic fibrosis (CF). However, getting young children and adolescents to comply with such physiotherapy can lead to confrontation and stressful situations within families, and can impact the health of the individual. We have developed game software that is controlled by breathing into a PEP mask or mouthpiece, using an air pressure sensor to interface with the PC. Combining games with mucus-clearing devices could provide a powerful means of encouraging children, teenagers and adults to engage more frequently, and more effectively, with vital mucus clearance physiotherapy. This paper presents promising initial results and describes plans for further usability testing.
{"title":"Using Serious Games to Motivate Children with Cystic Fibrosis to Engage with Mucus Clearance Physiotherapy","authors":"A. Oikonomou, David Day","doi":"10.1109/CISIS.2012.108","DOIUrl":"https://doi.org/10.1109/CISIS.2012.108","url":null,"abstract":"Positive expiratory pressure (PEP) therapy is an effective method for removing mucus build-up in the lungs of sufferers of chronic lung diseases such as cystic fibrosis (CF). However, the compliance by young children and adolescents to undertake such physiotherapy can lead to confrontation and stressful situations within families, and can impact on the health of the individual. We have developed game software which is controlled through breathing into a PEP mask or mouthpiece using an air pressure sensor to interface with the PC. By combining games with mucus clearing devices, it could provide a powerful means of encouraging children, teenagers and adults to engage more frequently, and effectively, with vital mucus clearance physiotherapy. This paper presents promising initial results and describes further usability testing plans.","PeriodicalId":158978,"journal":{"name":"2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117000198","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper a new intelligent system designed to support researchers in developing workflows for bioinformatics experiments is presented. The proposed system can suggest one or more strategies to solve the selected problem and support the user in assembling a workflow for complex experiments, using a knowledge base that represents expertise about the application domain and a rule-based system for decision-making. Moreover, the system can represent the workflow at different abstraction layers, freeing users from implementation details and assisting them in correctly configuring the algorithms. A sample workflow for protein complex extraction from a protein-protein interaction network is presented to show the main features of the proposed workflow representation.
{"title":"An Intelligent System for Building Bioinformatics Workflows","authors":"A. Fiannaca, S. Gaglio, M. L. Rosa, R. Rizzo, A. Urso","doi":"10.1109/CISIS.2012.141","DOIUrl":"https://doi.org/10.1109/CISIS.2012.141","url":null,"abstract":"In this paper a new intelligent system designed to support the researcher in the development of a workflow for bioinformatics experiments is presented. The proposed system is capable to suggest one or more strategies in order to resolve the selected problem and to support the user in the assembly of a workflow for complex experiments, using a a Knowledge base, representing the expertise about the application domain, and a Rule-Based system for decision-making activity. Moreover, the system can represent this workflow at different abstraction layers, freeing the user from implementation details and assisting him in the correct configuration of the algorithms. A sample workflow for protein complex extraction from protein-protein interaction network is presented in order to show the main features of the proposed workflow representation.","PeriodicalId":158978,"journal":{"name":"2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123999153","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cloud computing provides an efficient and flexible means for various services to meet the diverse and escalating needs of IT end-users. It offers novel functionalities, including the utilization of remote services, in addition to virtualization technology. The latter offers an efficient way to harness the cloud's power by fragmenting a physical cloud host into small, manageable virtual portions. As a norm, the virtualized parts are generated by the cloud provider's administrator through the hypervisor software, based on a generic need for various services. However, several obstacles arise from this generalized and static approach. In this paper, we study and propose a model for instantiating virtual machines dynamically in relation to the current job characteristics. We then simulate a virtualized cloud environment to evaluate the model's dynamism by measuring the correlation of virtual machines to hosts for certain job variations. This allows us to compute the expected average execution time of various virtual machine instantiations per job length.
{"title":"Cloud Virtual Machine Scheduling: Modelling the Cloud Virtual Machine Instantiation","authors":"Stelios Sotiriadis, N. Bessis, F. Xhafa, N. Antonopoulos","doi":"10.1109/CISIS.2012.113","DOIUrl":"https://doi.org/10.1109/CISIS.2012.113","url":null,"abstract":"Cloud computing provides an efficient and flexible means for various services to meet the diverse and escalating needs of IT end-users. It offers novel functionalities including the utilization of remote services in addition to the virtualization technology. The latter feature offers an efficient method to harness the cloud power by fragmenting a cloud physical host in small manageable virtual portions. As a norm, the virtualized parts are generated by the cloud provider administrator through the hyper visor software based on a generic need for various services. However, several obstacles arise from this generalized and static approach. In this paper, we study and propose a model for instantiating dynamically virtual machines in relation to the current job characteristics. Following, we simulate a virtualized cloud environment in order to evaluate the model's dynamic-ness by measuring the correlation of virtual machines to hosts for certain job variations. This will allow us to compute the expected average execution time of various virtual machines instantiations per job length.","PeriodicalId":158978,"journal":{"name":"2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems","volume":"29 4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123214047","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
More than 5% of adults suffer from some type of kidney disease, and every year millions of people die prematurely from cardiovascular diseases associated with chronic kidney disease (CKD). The best way to reduce deaths caused by kidney disease is early prophylaxis and treatment, which can be achieved through accurate and reliable diagnosis at an early stage. Among the various diagnostic methods, ultrasonographic diagnosis is low-cost, convenient, non-invasive, and timely. Most importantly, this type of inspection does not place an extra burden on patients who suffer from kidney diseases. This paper presents a computer-aided diagnosis tool based on analyzing ultrasonography images; the developed system can detect and classify different stages of CKD. The image processing techniques focus on detecting the atrophy of the kidney and the proportion of fibrosis within kidney tissues. The system includes image inpainting, noise filtering, contour detection, local contrast enhancement, tissue clustering, and the measurement of quantitative indicators for distinguishing the various stages of CKD. This study collected thousands of ultrasonic images from patients with kidney diseases, and selected representative CKD images were pre-analyzed and used for training and comparison. The calculated transition locations serve as reference indicators and provide physicians with an auxiliary, objective computer-aided diagnosis tool for CKD identification and classification.
{"title":"Ultrasonography Image Analysis for Detection and Classification of Chronic Kidney Disease","authors":"C. Ho, Tun-Wen Pai, Yuan-Chi Peng, Chien-Hung Lee, Yun-Chih Chen, Yang-Ting Chen, Kuo-Su Chen","doi":"10.1109/CISIS.2012.180","DOIUrl":"https://doi.org/10.1109/CISIS.2012.180","url":null,"abstract":"More than 5% of adults suffer from different types of kidney disease, and millions of people die prematurely from cardiovascular diseases associated with chronic kidney disease (CKD) in each year. The best way to reduce death caused by kidney disease is early prophylaxis and treatment, and which could be achieved through accurate and reliable diagnoses at the early stage. Among various diagnostic methods, ultrasonographic diagnosis is a low-cost, convenient, non-invasive, and timeliness method. Most importantly, this type inspection would not cause extra burden for patients who suffer kidney diseases. This paper presents a computer-aided diagnosis tool based on analyzing ultrasonography images, and the developed system could detect and classify different stages of CKD. The image processing techniques focus on detecting the atrophy of kidney and the proportion of fibrosis conditions within kidney tissues. The system includes image in painting, noise filtering, contour detection, local contrast enhancement, tissue clustering, and quantitative indicator measuring for distinguishing various stages of CKD. This study has collected thousands of ultrasonic images from patients with kidney diseases, and the selected representative CKD images were applied to be pre-analyzed and trained for comparison. The calculated transition locations as reference indicators could provide physicians an auxiliary and objective computer-aid diagnosis tool for CKD identification and classification.","PeriodicalId":158978,"journal":{"name":"2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123687736","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The most important element of industrial software development is the creation of a common vocabulary of terms for exchanging information between software engineers and industrial engineers. Based on this cooperation, technical domain knowledge is converted into data structures, algorithms, and rules. Nowadays, when people are used to receiving short and quick messages, the most efficient way of extracting knowledge is to work on examples or mockups, which facilitate a better understanding of the problem. Shorter rounds of presenting mockups allow continuous work on live object models rather than on specifications, which makes experts more open to sharing their knowledge and provides quicker and more reliable feedback on the data structure and the completeness of the model. The latest research and progress in the area of Model Driven Architecture (MDA) have resulted in advanced tools for creating models and automatically generating source code, as well as whole frameworks for creating application skeletons based on these models. In this paper, a collaborative process that uses the MDA approach (model, tools, and frameworks) to extract knowledge from domain experts is presented. In the presented process, the cooperation of a software engineer and a domain expert via phone calls and one live workshop resulted in a complete model of a machine and drive, including specific machine features and diagnostic processes. Finally, a working diagnostics application was verified by the domain expert, confirming that MDA produced the expected results. The diagnostics application was verified on real data collected on a winding machine for more than one month; the collected diagnostics data included more than 150 signals and 20 GB of raw analog data to dig into before condensed diagnostics results could be obtained. In addition to the process itself, the article presents the identified risks, the benefits of applying the MDA approach, and the lessons learned from applying this new, innovative process. As further work, the possibilities of extending existing models, including extending them dynamically, should be studied. In previous work we focused on an ontology-based approach, which does not meet all expectations when applied in a real-world environment. As a simpler and more mature technology, MDA was shown to be more productive and easier to adapt for building industrial applications.
{"title":"Model Driven Architecture for Industrial Applications","authors":"Maciej Zygmunt, M. Budyn","doi":"10.1109/CISIS.2012.140","DOIUrl":"https://doi.org/10.1109/CISIS.2012.140","url":null,"abstract":"The most important element of industrial software development is the creation of a common vocabulary of terms for exchanging information between software and industrial engineers. Based on this cooperation, technical domain knowledge is converted into data structures, algorithms and rules. Currently, when people are used to receiving short and quick messages, the most efficient way of knowledge extraction is work on examples or mockups to facilitate better understanding of the problem. Shorter rounds in the presentation of mockups allows continuous work on live object models rather than specifications which make experts more open for sharing their knowledge and provides quicker and more reliable feedback on the data structure and the completeness of the model. Latest research and progress in the area of Model Driven Architecture (MDA) resulted in advanced tools for the creation of models, automatic source code generation as well as whole frameworks for creating application skeletons based on these models. In this paper a collaborative process which uses MDA approach (model, tools and frameworks) for extracting knowledge from domain experts is presented. During presented process, a cooperation of a software engineer and a domain expert via phone calls and one live workshop resulted in a complete model of machine and drive including specific machine features and diagnostic processes. Finally, a working diagnostics application was verified by the domain expert proving that MDA resulted in the expected results. The diagnostics application was verified on real data collected on the winding machine for more than one month, collected diagnostics data included more than 150 signals and 20Gb of raw analog data to dig into before getting condensed diagnostics results. Additionally to the process itself, the article presents identified risks, benefits from applying the MDA approach and lessons learned from applying this new innovative process. For further work, the possibilities of extending and dynamically extending existing models should be studied. In previous works we have focused on an ontology based approach, which does not meet all expectations when it comes to application in real world environment. As simpler and more mature technology, MDA was shown to be more productive and easier to adapt for building industrial applications.","PeriodicalId":158978,"journal":{"name":"2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121499160","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}