{"title":"Intrusion detection based on clustering a data stream","authors":"S. Oh, Jin-Suk Kang, Y. Byun, T. Jeong, W. Lee","doi":"10.1007/11836810_30","DOIUrl":"https://doi.org/10.1007/11836810_30","url":null,"abstract":"","PeriodicalId":424175,"journal":{"name":"Third ACIS Int'l Conference on Software Engineering Research, Management and Applications (SERA'05)","volume":"27 3","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133136706","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Requirements for CBD products and process quality","authors":"Haeng-Kon Kim, R. Lee, Hae-Sool Yang","doi":"10.1109/SERA.2005.57","DOIUrl":"https://doi.org/10.1109/SERA.2005.57","url":null,"abstract":"This paper presents arguments for including the properties of processes involved in various approaches to component-based software development in predicting systems properties. It discusses how processes impact on system properties and relates the issues raised to standards that already address process and product quality. Although many standards still apply, CBD changes interpretations and emphases.","PeriodicalId":424175,"journal":{"name":"Third ACIS Int'l Conference on Software Engineering Research, Management and Applications (SERA'05)","volume":"175 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115961728","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A model-driven environment for component deployment","authors":"P. Hnetynka","doi":"10.1109/SERA.2005.12","DOIUrl":"https://doi.org/10.1109/SERA.2005.12","url":null,"abstract":"This paper presents deployment factory, a model-driven unified environment for deploying component-based applications. While there are projects aiming to develop a unified deployment environment for component-based applications, none of them is generic enough - they do not support heterogeneous applications, they are targeted for a single component technology and/or impose modifications of the underlying technologies. The deployment factory targets all these issues. It is based on (i) the OMG deployment and configuration specification, (ii) an analysis of contemporary used component technologies, and (iii) our experience from component-based development. Moreover, the paper also shows that a plain MDA approach (the one used in the OMG deployment and configuration specification) for building real systems is not always appropriate.","PeriodicalId":424175,"journal":{"name":"Third ACIS Int'l Conference on Software Engineering Research, Management and Applications (SERA'05)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126066714","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A cognitive model for program comprehension","authors":"Shaochun Xu","doi":"10.1109/SERA.2005.2","DOIUrl":"https://doi.org/10.1109/SERA.2005.2","url":null,"abstract":"This paper proposes a cognitive model to classify the level of cognition and cognitive activities in program comprehension. This model is composed of input, cognitive process and output, among which the cognitive process includes four activities (absorption, denial, reorganization and expulsion) at six Bloom learning levels. Compared with the existing models, our learning model is more complete and more detailed. It not only describes cognitive activities in detail, but can also be applied in most of the cases. Our model can also reveal the differences between experts and novices in program comprehension. It provides some useful insights on how to build a tool to aid program comprehension. We also find that the so-called traditional program comprehension process involves activities not only at comprehension level of the Bloom's taxonomy, but at higher levels as well. A case study is conducted to validate this learning model.","PeriodicalId":424175,"journal":{"name":"Third ACIS Int'l Conference on Software Engineering Research, Management and Applications (SERA'05)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120991733","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Analysis of secure design patterns: a case study in e-commerce system","authors":"Jing Wang, Yeong-Tae Song, L. Chung","doi":"10.1109/SERA.2005.22","DOIUrl":"https://doi.org/10.1109/SERA.2005.22","url":null,"abstract":"Retrofitting security requirement into an existing system tends to result in less wanted qualities. So, it is a preferred practice to design with security in mind right from the beginning of the development process. An NFR framework has been established to incorporate non-functional requirements (NFRs) (L. Chung et al., 2000) that are crucial to secure system design into the development process. In this paper, we propose a methodology that utilizes the NFR framework to come up with secure design by selecting security design patterns for the domain specific application such as e-commerce system.","PeriodicalId":424175,"journal":{"name":"Third ACIS Int'l Conference on Software Engineering Research, Management and Applications (SERA'05)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121523999","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design of SPICE experience factory model for accumulation and utilization of process assessment experience","authors":"Gihan Kim, Minkwang Lee, Jong-Soo Lee, Kyungwhan Lee","doi":"10.1109/SERA.2005.34","DOIUrl":"https://doi.org/10.1109/SERA.2005.34","url":null,"abstract":"With growing interest in software process improvement (SPI), many companies are introducing international process models and standards. SPICE is most widely used process assessment model in the SPI work today. In the process of introducing and applying SPICE, practical experiences contribute to enhancing the project performance. The experience helps people to make decisions under uncertainty, and to find better compromises. This paper suggests a SPICE experience factory (SEF) model to use SPICE assessment experience. For this, we collected SPICE assessment results which were conducted in Korea from 1999 to 2004. The collected data does not only contain rating information but also specifies strengths and improvement point for each assessed company and its process. To use this assessment result more efficiently, root words were derived from each result items. And root words were classified into four: 1) measurement, 2) work product, 3) process performance, and 4) process definition and deployment. Database was designed and constructed to save all analyzed data in forms of root words. Database also was designed to efficiently search information the organization needs by strength/improvement point, or root word for each level. This paper describes procedures of SEF model and presents methods to utilize it. By using the proposed SEF model, even organizations which plan to undergo SPICE assessment for the first time can establish the optimal improvement strategies.","PeriodicalId":424175,"journal":{"name":"Third ACIS Int'l Conference on Software Engineering Research, Management and Applications (SERA'05)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122185985","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Inter-AS session & connection management for QoS-guaranteed DiffServ provisioning","authors":"Young-Tak Kim","doi":"10.1109/SERA.2005.48","DOIUrl":"https://doi.org/10.1109/SERA.2005.48","url":null,"abstract":"For QoS-guaranteed realtime multimedia differentiated services provisioning across multiple AS (autonomous system) domain network, session configuration with SIP/SDP and bidirectional connection establishment using UNI & NNI signaling (i.e., RSVP-TE) are essential. Also, a scalable transit networking scheme must be provided so as to configure scalable per-class-type QoS-guaranteed packet processing and to provide scalable connection admission control (CAC). In this paper, we analyze the functional architecture of session & connection management with SIP/SDP, RSVP-TE, COPS (common open policy service)-based CAC and QoS-guaranteed virtual overlay networking. We also evaluate the overall interaction procedure among functional modules for scalable QoS-guaranteed DiffServ provisioning across multiple autonomous system (AS) domain networks.","PeriodicalId":424175,"journal":{"name":"Third ACIS Int'l Conference on Software Engineering Research, Management and Applications (SERA'05)","volume":"5 15","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120820989","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A component-based approach for integrating mobile agents into the existing Web infrastructure","authors":"Haeng-Kon Kim","doi":"10.1109/SERA.2005.5","DOIUrl":"https://doi.org/10.1109/SERA.2005.5","url":null,"abstract":"Mobile agents provide a new abstraction for deploying functionality over the existing Internet infrastructure. In the MA (mobile agents) framework there are no agent platforms. Instead applications become agent-enabled by using simple JavaBeans components. In this paper we present an architecture that allows currently available Web servers to become capable of sending and receiving agents in an easy way. By using this approach, existing Web infrastructure can be maintained, while gaining a whole new potential by being able to make use of agent technology. Our approach involves wrapping the components inside a Java Servlet that can be included in any Web server supporting the Servlet specification. This Servlet enables the servers to receive and send agents that can query local information, and also enables the agents to behave as Servlets themselves. We currently have used the framework with several existing commercial Web servers, inclusively having the security mechanisms of the framework correctly running and integrated with the security architecture of the server.","PeriodicalId":424175,"journal":{"name":"Third ACIS Int'l Conference on Software Engineering Research, Management and Applications (SERA'05)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134521316","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Combining classification improvements by ensemble processing","authors":"N. Ishii, Eisuke Tsuchiya, Y. Bao, N. Yamaguchi","doi":"10.1109/SERA.2005.30","DOIUrl":"https://doi.org/10.1109/SERA.2005.30","url":null,"abstract":"The k-nearest neighbor (KNN) classification is a simple and effective classification approach. However, improving performance of the classifier is still attractive. Combining multiple classifiers is an effective technique for improving accuracy. There are many general combining algorithms, such as Bagging, Boosting, or Error Correcting Output Coding that significantly improve the classifier such as decision trees, rule learners, or neural networks. Unfortunately, these combining methods developed do not improve the nearest neighbor classifiers. In this paper, first, we present a new approach to combine multiple KNN classifiers based on different distance functions, in which we apply multiple distance functions to improve the performance of the k-nearest neighbor classifier. Second, we develop a combining method, in which the weights of the distance function are learnt by genetic algorithm. Finally, combining classifiers in error correcting output coding, are discussed. The proposed algorithms seek to increase generalization accuracy when compared to the basic k-nearest neighbor algorithm. Experiments have been conducted on some benchmark datasets from the UCI machine learning repository. The results show that the proposed algorithms improve the performance of the k-nearest neighbor classification.","PeriodicalId":424175,"journal":{"name":"Third ACIS Int'l Conference on Software Engineering Research, Management and Applications (SERA'05)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122555545","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Binary component adaptation technique and supporting tool","authors":"Jeong Ah Kim, Kyung-Whan Lee","doi":"10.1109/SERA.2005.26","DOIUrl":"https://doi.org/10.1109/SERA.2005.26","url":null,"abstract":"This research developed the technique necessary for the adaptation of components and a tool which supports this. The adaptation of components becomes necessary during the process of reusing or assembling components, and this is because the interface of the component is, in many cases, different than the component the developer wishes to assemble. Occasionally, additional attributes may need to be defined in accordance to new requirements. Consequently, the process for component adaptation is crucial for the reuse and assembly of components. In order to support the adaptation of components, this research proposes an adaptation technique dependent upon binary component adaptation techniques and adaptation components. In addition, a support tool was developed to support an effective adaptation process.","PeriodicalId":424175,"journal":{"name":"Third ACIS Int'l Conference on Software Engineering Research, Management and Applications (SERA'05)","volume":"105 ","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"113991907","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}