Task and team management in the Distributed Software Project Management Tool
Pub Date: 2001-10-08  DOI: 10.1109/CMPSAC.2001.960645
H. Lam, P. Maheshwari
The Distributed Software Project Management Tool (DSPMtool) is an integrated set of tools that opens software project management to users distributed across the world. The first prototype provided the core of a software repository and configuration management. The authors present DSPMtool's second prototype, which introduces task and team management to improve the quality of software projects. A task management system in turn requires a self-monitoring mechanism, which DSPMtool provides. Like the first prototype, the second is built in a Visual Basic environment, incorporating the new design into the existing architecture. DSPMtool uses Component Object Model (COM) and ActiveX technologies, together with an object-oriented software design methodology, to build the architecture of the second prototype.
{"title":"Task and team management in the Distributed Software Project Management Tool","authors":"H. Lam, P. Maheshwari","doi":"10.1109/CMPSAC.2001.960645","DOIUrl":"https://doi.org/10.1109/CMPSAC.2001.960645","url":null,"abstract":"The Distributed Software Project Management Tool (DSPMtool) is an integration of tools opening the realms of software project management to users distributed across the world. The first prototype presented the core of a software repository and configuration management. The authors present DSPMtool's second prototype and introduce new concepts of task and team management to improve the quality of software projects. Pursuing a task management system, no doubt introduces the need for a self-monitoring mechanism, which DSPMtool successfully provides. Born and raised in a Visual Basic environment, the second prototype continues to adapt to this environment, incorporating the new design into the existing architecture. The DSPMtool utilises Component Object Modeling (COM) and ActiveX technologies, as well as employing object-oriented software design methodology to build the architecture for the DSPMtool second prototype.","PeriodicalId":269568,"journal":{"name":"25th Annual International Computer Software and Applications Conference. COMPSAC 2001","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127562659","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A configuration management system supporting component-based software development
Pub Date: 2001-10-08  DOI: 10.1109/CMPSAC.2001.960594
Lu Zhang, Hong Mei, Hong Zhu
Component-based software development has been viewed as an emerging paradigm of software development. This paper analyzes the requirements of configuration management in the component-based development process. Based on this analysis, a prototype configuration management system is proposed to meet those requirements. An example of using the system is also given.
{"title":"A configuration management system supporting component-based software development","authors":"Lu Zhang, Hong Mei, Hong Zhu","doi":"10.1109/CMPSAC.2001.960594","DOIUrl":"https://doi.org/10.1109/CMPSAC.2001.960594","url":null,"abstract":"Component-based software development has been viewed as an emerging paradigm of software development. This paper analyzes the requirements of configuration management in component-based development process. Based on the analysis, a prototype configuration management system is proposed to meet the requirements. An example of using the system is also given.","PeriodicalId":269568,"journal":{"name":"25th Annual International Computer Software and Applications Conference. COMPSAC 2001","volume":"76 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131128820","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A novel intrusion detection system model for securing web-based database systems
Pub Date: 2001-10-08  DOI: 10.1109/CMPSAC.2001.960624
Shu Wenhui, Daniel T. H. Tan
Intrusion detection (ID) has become an important technology for protecting information resources and databases from malicious attacks and information leakage. This paper proposes a novel two-layer mechanism to detect intrusions against a web-based database service. Layer one builds historical profiles from audit trails and other log data provided by the web server and database server, and triggers pre-alarms when anomalies occur. Layer two then further analyzes the pre-alarms generated by layer one, integrating the alarm context with the alarms themselves rather than performing a simple "analysis in isolation". This reduces error rates, especially false positives, improves the accuracy of intrusion detection and alarm notification, and hence enables more effective incident handling.
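The two-layer flow can be pictured with a small sketch (the class, method, and threshold names below are illustrative only; the paper gives no code): layer one compares each audit event against the historical profile and raises a pre-alarm on anomalies, and layer two correlates the pre-alarm with surrounding events before escalating it to a real alarm.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Minimal sketch of the two-layer idea; all names are invented for illustration.
    class TwoLayerDetector {
        private final Deque<AuditEvent> recentEvents = new ArrayDeque<>();

        // Layer one: compare an audit event against the historical profile.
        PreAlarm layerOne(AuditEvent event, Profile profile) {
            recentEvents.addLast(event);
            if (recentEvents.size() > 1000) recentEvents.removeFirst();
            return profile.isAnomalous(event) ? new PreAlarm(event) : null;
        }

        // Layer two: analyse the pre-alarm together with its context rather than
        // "in isolation"; only correlated anomalies become full alarms.
        boolean layerTwo(PreAlarm preAlarm) {
            long related = recentEvents.stream()
                    .filter(e -> e.sameSession(preAlarm.event()))
                    .filter(AuditEvent::suspicious)
                    .count();
            return related >= 3; // escalate only when the context supports it
        }
    }

    record AuditEvent(String session, boolean suspicious) {
        boolean sameSession(AuditEvent other) { return session.equals(other.session); }
    }
    record PreAlarm(AuditEvent event) {}
    interface Profile { boolean isAnomalous(AuditEvent e); }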
{"title":"A novel intrusion detection system model for securing web-based database systems","authors":"Shu Wenhui, Daniel T. H. Tan","doi":"10.1109/CMPSAC.2001.960624","DOIUrl":"https://doi.org/10.1109/CMPSAC.2001.960624","url":null,"abstract":"Intrusion detection (ID) has become an important technology for protecting information resources and databases from malicious attacks and information leakage. This paper proposes a novel two-layer mechanism to detect intrusions against a web-based database service. Layer one builds historical profiles based on audit trails and other log data provided by the web server and database server. Pre-alarms will be triggered if anomalies occurred. Layer two makes further analysis on the pre-alarms generated from Layer one. Such methods integrates the alarm context with the alarms themselves rather than a simple \"analysis in isolation\". This can reduce the error rates, especially false positives and greatly improve the accuracy of intrusion detection, alarm notification and hence more effective incident handling.","PeriodicalId":269568,"journal":{"name":"25th Annual International Computer Software and Applications Conference. COMPSAC 2001","volume":"469 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133044348","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Converting Web applications to data components: design methodology
Pub Date: 2001-10-08  DOI: 10.1109/CMPSAC.2001.960595
J. Pazdziora
Most current Web applications are closely tied to the client, and large parts of their code are concerned with producing proper output markup. Converting them into components that produce pure data output, independent of presentation specifics, increases maintainability and eliminates duplicated code for different output media, and is thus a key requirement for modern systems. In this paper we explore the steps needed to transform existing applications into data-oriented components, and present implementation and performance issues and their solutions.
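As a rough illustration of the separation the abstract advocates (class and method names are hypothetical, not taken from the paper), a data component can return plain structured data while separate renderers serialize it to HTML or XML, so markup production is no longer duplicated per output medium.

    import java.util.List;
    import java.util.Map;

    // Illustrative only: the component returns data, renderers handle presentation.
    class OrderListComponent {
        // Pure data output, no markup.
        List<Map<String, String>> fetchOrders(String customerId) {
            return List.of(
                    Map.of("id", "1001", "status", "shipped"),
                    Map.of("id", "1002", "status", "open"));
        }
    }

    interface Renderer {
        String render(List<Map<String, String>> rows);
    }

    class HtmlRenderer implements Renderer {
        public String render(List<Map<String, String>> rows) {
            StringBuilder sb = new StringBuilder("<ul>");
            for (Map<String, String> r : rows)
                sb.append("<li>").append(r.get("id")).append(": ")
                  .append(r.get("status")).append("</li>");
            return sb.append("</ul>").toString();
        }
    }

    class XmlRenderer implements Renderer {
        public String render(List<Map<String, String>> rows) {
            StringBuilder sb = new StringBuilder("<orders>");
            for (Map<String, String> r : rows)
                sb.append("<order id=\"").append(r.get("id")).append("\">")
                  .append(r.get("status")).append("</order>");
            return sb.append("</orders>").toString();
        }
    }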
{"title":"Converting Web applications to data components: design methodology","authors":"J. Pazdziora","doi":"10.1109/CMPSAC.2001.960595","DOIUrl":"https://doi.org/10.1109/CMPSAC.2001.960595","url":null,"abstract":"Most of the current Web applications are closely tied to the client and large parts of their code concern with producing proper output markup. Converting them into components that produce pure data output, independent from presentation specifics, increases maintainability and eliminates duplicated code for different output media, and thus is the key requirement for modem systems. In this paper we explore steps needed to transfer existing applications to data oriented components, and present implementation and performance issues and their solutions.","PeriodicalId":269568,"journal":{"name":"25th Annual International Computer Software and Applications Conference. COMPSAC 2001","volume":"159 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115913082","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Formalization of software testing criteria using the Z notation
Pub Date: 2001-10-08  DOI: 10.1109/CMPSAC.2001.960638
S. Vilkomir, Jonathan P. Bowen
Describes an approach to the formalization of software testing criteria for computer systems. A brief review of control-flow criteria is given. The Z notation is selected as the formal language for describing the criteria. Z schemas are presented defining the following criteria: statement coverage, decision coverage, condition coverage, decision/condition coverage, full predicate coverage, modified condition/decision coverage, and multiple condition coverage. This characterization can help in the correct understanding of different types of testing and in the correct application of a desired testing regime.
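To indicate the flavour of such a formalization (this is a simplified predicate, not one of the authors' actual Z schemas), decision coverage can be written as the requirement that every decision evaluates to both true and false over the test set:

    % Illustrative sketch only; Decisions, Tests and eval are assumed definitions.
    \forall d \in Decisions \;\bullet\;
        \exists t_1, t_2 \in Tests \;\bullet\;
            eval(d, t_1) = true \;\wedge\; eval(d, t_2) = false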
{"title":"Formalization of software testing criteria using the Z notation","authors":"S. Vilkomir, Jonathan P. Bowen","doi":"10.1109/CMPSAC.2001.960638","DOIUrl":"https://doi.org/10.1109/CMPSAC.2001.960638","url":null,"abstract":"Describes an approach to formalization of criteria of computer systems software testing. A brief review of control-flow criteria is introduced. As a formal language for describing the criteria, the Z notation is selected. Z schemas are presented for definitions of the following criteria: statement coverage, decision coverage, condition coverage, decision/condition coverage, full predicate coverage, modified condition/decision coverage, and multiple condition coverage. This characterization could help in the correct understanding of different types of testing and also the correct application of a desired testing regime.","PeriodicalId":269568,"journal":{"name":"25th Annual International Computer Software and Applications Conference. COMPSAC 2001","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129395334","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Exception handling in component-based system development
Pub Date: 2001-10-08  DOI: 10.1109/CMPSAC.2001.960671
A. Romanovsky
Designers of component-based software face two problems in dealing with abnormal events: developing exception handling at the level of the integrated system, and accommodating (and adjusting, if necessary) the exceptions and exception handling provided by individual components. Our intention is to develop an exception handling framework suitable for component-based system development by applying general exception handling mechanisms that have been proposed and successfully used in concurrent/distributed systems and in programming languages. The framework is applied in three steps. First, individual components are wrapped so that the wrappers perform local error detection and exception handling and, if necessary, signal external exceptions outside the component. Second, the execution of the overall system is structured as a set of dynamic actions in which components take part. Such actions have important properties that facilitate exception handling: they are atomic, contain erroneous information, and serve as recovery regions. The last step is designing exception handling at the action level: each action (i.e. all components participating in it) handles the exceptions signalled by individual wrapped components.
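The first step (wrapping) can be sketched roughly as follows; the component, wrapper, and exception names are hypothetical and are only meant to show a wrapper performing local error detection and mapping internal failures to an external exception for the enclosing action to handle.

    // Illustrative sketch of component wrapping; not the paper's framework API.
    class ExternalComponentException extends Exception {
        ExternalComponentException(String msg, Throwable cause) { super(msg, cause); }
    }

    interface PaymentComponent {
        void charge(String account, long cents) throws Exception;
    }

    // The wrapper performs local error detection and handling and, when local
    // handling is not possible, signals an external exception to the action level.
    class PaymentComponentWrapper {
        private final PaymentComponent wrapped;

        PaymentComponentWrapper(PaymentComponent wrapped) { this.wrapped = wrapped; }

        void charge(String account, long cents) throws ExternalComponentException {
            if (cents <= 0) {
                // local error detection: reject without involving the component
                throw new ExternalComponentException("invalid amount", null);
            }
            try {
                wrapped.charge(account, cents);
            } catch (Exception internal) {
                // map the component's internal exception to an external one so the
                // surrounding action can treat it as a recovery region boundary
                throw new ExternalComponentException("charge failed", internal);
            }
        }
    }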
{"title":"Exception handling in component-based system development","authors":"A. Romanovsky","doi":"10.1109/CMPSAC.2001.960671","DOIUrl":"https://doi.org/10.1109/CMPSAC.2001.960671","url":null,"abstract":"Designers of component-based software face two problems related to dealing with abnormal events: developing exception handling at the level of the integrated system and accommodating (and adjusting, if necessary) exceptions and exception handling provided by individual components. Our intention is to develop an exception handling framework suitable for component-based system development by applying general exception handling mechanisms which have been proposed and successfully used in concurrent/distributed systems and in programming languages. The framework is applied in three steps. Firstly, individual components are wrapped in such a way that the wrappers perform activity related to local error detection and exception handling, and signal, if necessary, external exceptions outside the component. At the second step the execution of the overall system is structured as a set of dynamic actions in which components take parts. Such actions have important properties which facilitate exception handling: they are atomic, contain erroneous information and serve as recovery regions. The last step is designing exception handling at the action level: each action (i.e. all components participating in it) handles exceptions signalled by individual wrapped components.","PeriodicalId":269568,"journal":{"name":"25th Annual International Computer Software and Applications Conference. COMPSAC 2001","volume":"85 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129583077","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Melanoma prediction using data mining system LERS
Pub Date: 2001-10-08  DOI: 10.1109/CMPSAC.2001.960676
J. P. Grzymala-Busse, J. Grzymala-Busse, Z. Hippe
One of the important tools for early diagnosis of malignant melanoma is the total dermatoscopy score (TDS), computed using the ABCD (asymmetry, border, color, diameter) formula. Our primary objective was to check whether the ABCD formula is optimal. Using a data set containing 276 cases of melanoma and the LERS (Learning from Examples based on Rough Sets) data mining system, we checked more than 20,000 modified ABCD formulas, computing the predicted error rate of melanoma diagnosis using 10-fold cross-validation for every modified formula. As a result, we found the optimal ABCD formula for our setup: discretization based on cluster analysis, the LEM2 (Learning from Examples Module, version 2) algorithm (one of the four LERS algorithms for rule induction) and the standard LERS classification scheme. The error rate for the standard ABCD formula was 10.21%, while for the optimal ABCD formula the error rate was reduced to 6.04%. Some research in melanoma diagnosis shows that the use of the ABCD formula does not improve the error rate. Our research shows that the ABCD formula is useful, since, for our data set, the error rate without the use of the ABCD formula was higher (13.73%).
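For reference, the standard ABCD rule combines the four component scores into the TDS with fixed weights (the commonly cited values are shown below; the modified formulas searched by the authors vary these coefficients):

    % Commonly cited ABCD weights; the paper evaluates variants of this formula.
    TDS = 1.3 \cdot A + 0.1 \cdot B + 0.5 \cdot C + 0.5 \cdot D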
{"title":"Melanoma prediction using data mining system LERS","authors":"J. P. Grzymala-Busse, J. Grzymala-Busse, Z. Hippe","doi":"10.1109/CMPSAC.2001.960676","DOIUrl":"https://doi.org/10.1109/CMPSAC.2001.960676","url":null,"abstract":"One of the important tools for early diagnosis of malignant melanoma is the total dermatoscopy score (TDS), computed using the ABCD (asymmetry, border, color, diameter) formula. Our primary objective was to check whether the ABCD formula is optimal. Using a data set containing 276 cases of melanoma and the LERS (Learning from Examples based on Rough Sets) data mining system, we checked more than 20,000 modified formulas for ABCD, computing the predicted error rate of melanoma diagnosis using 10-fold cross-validation for every modified formula. As a result, we found the optimal ABCD formula for our setup: discretization based on cluster analysis, the LEM2 (Learning from Examples Module, version 2) algorithm (one of the four LERS algorithms for rule induction) and the standard LERS classification scheme. The error rate for the standard ABCD formula was 10.21 %, while for the optimal ABCD formula the error rate was reduced to 6.04%. Some research in melanoma diagnosis shows that the use of the ABCD formula does not improve the error rate. Our research shows that the ABCD formula is useful, since, for our data set, the error rate without the use of the ABCD formula was higher (13.73%).","PeriodicalId":269568,"journal":{"name":"25th Annual International Computer Software and Applications Conference. COMPSAC 2001","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130006490","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An empirical study of software productivity
Pub Date: 2001-10-08  DOI: 10.1109/CMPSAC.2001.960633
S. Morasca, G. Russo
We studied productivity in a real-life environment in the Italian public administration by applying the goal/question/metric paradigm to define a productivity-related measurement goal and derive measures deemed relevant to reaching that goal. Productivity was studied from both a functional and a product size perspective. Our study highlighted a few factors related to either aspect of productivity. The results may provide software managers with support for evaluating and improving software processes, so that they can make decisions based on more quantitative information.
{"title":"An empirical study of software productivity","authors":"S. Morasca, G. Russo","doi":"10.1109/CMPSAC.2001.960633","DOIUrl":"https://doi.org/10.1109/CMPSAC.2001.960633","url":null,"abstract":"We studied productivity in a real-life environment in the Italian public administration by applying the goal/question/metrics paradigm to define a productivity related measurement goal and derive measures that were deemed relevant to reach the stated goal. Productivity was studied from both a functional and a product size perspectives. Our study has highlighted a few factors that are related to either aspect of productivity. The results may provide software managers with support for evaluating and improving software processes, so they can make decisions based on more quantitative information.","PeriodicalId":269568,"journal":{"name":"25th Annual International Computer Software and Applications Conference. COMPSAC 2001","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125729108","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Java in industrial automation - a virtual PLC
Pub Date: 2001-10-08  DOI: 10.1109/CMPSAC.2001.960656
M. Grabner, G. Leonhartsberger, A. Leutgeb, J. Altmann
Remote access to a Programmable Logic Controller (PLC) is a prerequisite for effective maintenance. We present an architecture for the development of tools supporting the installation, configuration, maintenance, supervision, and diagnosis of remote PLCs via the Internet. The developed architecture, called Virtual PLC, consists of a set of extensible components describing a real PLC. The Virtual PLC uses Java(TM) technology to be platform independent, and enables developers to rapidly implement tools for remote diagnosis, supervision and maintenance of a real PLC.
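The component idea can be pictured roughly as a Java interface that mirrors a real PLC and that remote tools program against; the interface and method names below are invented for illustration and are not the paper's actual API.

    import java.io.Serializable;
    import java.util.Map;

    // Hypothetical facade for a "virtual PLC" component.
    interface VirtualPlc extends Serializable {
        Map<String, Object> readStatus();            // supervision
        void writeParameter(String name, Object v);  // configuration
        byte[] fetchDiagnosticLog();                 // diagnosis
        void uploadProgram(byte[] image);            // installation / maintenance
    }

    // A remote maintenance tool works only against the interface, so it stays
    // independent of the concrete PLC hardware and of the transport used.
    class MaintenanceTool {
        void checkAndPatch(VirtualPlc plc, byte[] newImage) {
            Object mode = plc.readStatus().get("mode");
            if (!"RUN".equals(mode)) {
                plc.uploadProgram(newImage);
            }
        }
    }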
{"title":"Java in industrial automation-a virtual PLC","authors":"M. Grabner, G. Leonhartsberger, A. Leutgeb, J. Altmann","doi":"10.1109/CMPSAC.2001.960656","DOIUrl":"https://doi.org/10.1109/CMPSAC.2001.960656","url":null,"abstract":"Remote access of a Programmable Logical Control (PLC) is the prerequisite for an effective maintenance. We present an architecture for the development of tools supporting the installation, configuration, maintenance, supervision, and diagnosis of remote PLCs via the Internet. The developed architecture, called Virtual PLC, consists of a set of extensible components describing a real PLC. The Virtual PLC uses Java/sup TM/ Technology to be platform independent. The Virtual PLC enables developers to rapidly implement tools for remote diagnosis, supervision and maintenance of a real PLC.","PeriodicalId":269568,"journal":{"name":"25th Annual International Computer Software and Applications Conference. COMPSAC 2001","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126101903","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
InfoSleuth: agent-based system for data integration and analysis
Pub Date: 2001-10-08  DOI: 10.1109/CMPSAC.2001.960655
T. Ksiezyk, G. Martin, Qing Jia
InfoSleuth is an agent-based system that automates the gathering and analysis of dynamic, distributed data accessible over the web. To gather data, a user specifies an SQL query which references elements from a common domain ontology (i.e., terminology). Agents then locate resources that have advertised having data relevant to these elements and translate the ontology-based query into queries referencing elements in the different schemas of the identified local databases. Other agents then integrate the results returned from these multiple resources and express them in terms of the common ontology. The large amount of data returned may be overwhelming, and so analysis agents serve to filter and interpret it. InfoSleuth is implemented in Java, and includes a common agent shell and specializations of the shell. The system was developed within a research environment over the course of 5 years and is now being hardened for commercial applications.
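The broker-style translation step can be sketched as follows; the class name, mapping structure, and query strings are invented for illustration only, and the real InfoSleuth agents are considerably richer.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;

    // Rough sketch of ontology-to-local-schema query rewriting.
    class OntologyQueryBroker {
        // resource name -> mapping from ontology terms to local column names,
        // as advertised by the resource agents
        private final Map<String, Map<String, String>> advertisedMappings;

        OntologyQueryBroker(Map<String, Map<String, String>> advertisedMappings) {
            this.advertisedMappings = advertisedMappings;
        }

        // Rewrite an ontology-level term into one local query per resource that
        // advertised it; a result agent would then merge and re-express the answers
        // in terms of the common ontology.
        List<String> translate(String ontologyTerm) {
            List<String> localQueries = new ArrayList<>();
            advertisedMappings.forEach((resource, mapping) -> {
                String column = mapping.get(ontologyTerm);
                if (column != null) {
                    localQueries.add("SELECT " + column + " FROM " + resource);
                }
            });
            return localQueries;
        }
    }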
{"title":"InfoSleuth: agent-based system for data integration and analysis","authors":"T. Ksiezyk, G. Martin, Qing Jia","doi":"10.1109/CMPSAC.2001.960655","DOIUrl":"https://doi.org/10.1109/CMPSAC.2001.960655","url":null,"abstract":"InfoSleuth is an agent-based system that automates the gathering and analysis of dynamic, distributed data accessible over the web. To gather data, a user specifies an SQL query which references elements from a common domain ontology (i.e., terminology). Agents then locate resources that have advertised having data relevant to these elements and translate the ontology-based query into queries referencing elements in the different schemas of the identified local databases. Other agents then integrate the results returned from these multiple resources and express them in terms of the common ontology. The large amount of data returned may be overwhelming, and so analysis agents serve to filter and interpret it. InfoSleuth is implemented in Java, and includes a common agent shell and specializations of the shell. The system was developed within a research environment over the course of 5 years and is now being hardened for commercial applications.","PeriodicalId":269568,"journal":{"name":"25th Annual International Computer Software and Applications Conference. COMPSAC 2001","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114161379","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}