Software industry in Taiwan
Pub Date: 1996-08-19 | DOI: 10.1109/CMPSAC.1996.544157
J. Ke
Taiwan is ranked the third largest IT product manufacturing country worldwide. Although software accounts for only 10% of this IT production value, its average compound growth rate has reached 20% over the past five years. The main factors behind the growth of the Taiwan software industry are: 1. The promotion of the Internet and NII (National Information Infrastructure) projects has generated strong demand for PCs and related applications. 2. The deployment of large-scale mission-critical information systems has stimulated large investments in system integration and application development. 3. The promotion of IPR awareness and computer literacy has brought increased prosperity to the computer-related market. 4. The Five-Year Software Development Plan sponsored by the Ministry of Economic Affairs has built a stronger software industry structure and accelerated the industry's development. 5. Software technology R&D funding provided by the Ministry of Economic Affairs has motivated both research institutes and the private sector to create innovative software products. Growth of the Taiwan software industry is forecast at over 20% for 1996. This can be attributed to the expectation of continued economic growth, increased demand from the distribution and financial sectors, rapid advances in multimedia and educational software, and the growing popularity of information and network applications and services.
{"title":"Software industry in Taiwan","authors":"J. Ke","doi":"10.1109/CMPSAC.1996.544157","DOIUrl":"https://doi.org/10.1109/CMPSAC.1996.544157","url":null,"abstract":"Taiwan is ranked the third largest IT product manufacturing country worldwide. Although the software production value is only 10% of the production value, the average compound growth rate has reached 20% in the past five years. The main factors behind the growth of the Taiwan software industry are: 1. The promotion of Internet and NII (National Information Infrastructure) projects generated big demand for PCs and its related application. 2. The deployment of large-scale mission-critical information systems has stimulated large investment in system integration and application development. 3. The promotion of IPR awareness and computer literacy has brought increased prosperity to the computer-related market. 4. The Five-year Software Development Plan sponsored by the Ministry of Economic Affairs has built a stronger software industry structure and accelerated the industry development. 5. The software technology R&D funding provided by the Ministry of Economic Affairs motivated both the research institutes and the private sector to create innovative software products. The rate of the Taiwan software industry growth is forecast at over 20% in 1996. This can be attributed to the expectation of continued economic growth, increased demands from the distribution and financial sectors, rapid advances in multimedia and educational software, as well as the growing popularity of information and network applications and services.","PeriodicalId":306601,"journal":{"name":"Proceedings of 20th International Computer Software and Applications Conference: COMPSAC '96","volume":"131 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1996-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123207955","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Software Cost Option Strategy Tool (S-COST)
Pub Date: 1996-08-19 | DOI: 10.1109/CMPSAC.1996.542289
Barry W. Boehm, H. In
The process of resolving cost-requirements conflicts is difficult because of incompatibilities among stakeholders' interests and priorities, complex cost-requirements dependencies, and an exponentially increasing option space for larger systems. The paper describes an exploratory knowledge-based tool, the Software Cost Option Strategy Tool (S-COST), for assisting stakeholders in surfacing appropriate cost resolution options, visualizing those options, and negotiating a mutually satisfactory balance of requirements and cost. S-COST operates in the context of the USC-CSE WinWin system (a groupware support system for determining software and system requirements as negotiated win conditions), QARCC (a support system for identifying conflicts among quality requirements), and COCOMO (the Constructive Cost Model). Initial analyses of its capabilities indicate that its semiautomated approach provides users with improved capabilities for addressing cost-requirements issues.
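As a point of reference for the cost side of such negotiations, here is a minimal sketch of the Basic COCOMO equations that cost estimates of this kind build on. The organic-mode coefficients (2.4, 1.05, 2.5, 0.38) are the published Basic COCOMO values; the function name and example sizes are invented for illustration, and nothing here reproduces S-COST itself.

    # Basic COCOMO estimate with the published organic-mode coefficients
    # (effort = 2.4 * KLOC**1.05 person-months, schedule = 2.5 * effort**0.38 months).
    def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
        """Return (effort in person-months, schedule in months)."""
        effort = a * kloc ** b
        schedule = c * effort ** d
        return effort, schedule

    # Compare three candidate scope options during a cost-requirements trade-off.
    for size in (8, 32, 128):  # KSLOC
        effort, months = cocomo_basic(size)
        print(f"{size:4d} KSLOC -> {effort:7.1f} person-months, {months:4.1f} months")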
{"title":"Software Cost Option Strategy Tool (S-COST)","authors":"Barry W. Boehm, H. In","doi":"10.1109/CMPSAC.1996.542289","DOIUrl":"https://doi.org/10.1109/CMPSAC.1996.542289","url":null,"abstract":"The process of resolving cost-requirements conflicts is difficult because of incompatibilities among stakeholders' interests and priorities, complex cost requirements dependencies, and an exponentially increasing option space for larger systems. The paper describes an exploratory knowledge-based tool,the Software Cost Option Strategy Tool (S-COST), for assisting stakeholders to surface appropriate cost resolution options, to visualize the options, and to negotiate a mutually satisfactory balance of requirements and cost. S-COST operates in the context of the USC-CSE WinWin system (a groupware support system for determining software and system requirements as negotiated win conditions), QARCC (a support system for identifying conflicts in quality requirements), and COCOMO (constructive cost estimation model). Initial analyses of its capabilities indicate that its semiautomated approach provides users with improved capabilities for addressing cost-requirements issues.","PeriodicalId":306601,"journal":{"name":"Proceedings of 20th International Computer Software and Applications Conference: COMPSAC '96","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1996-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128494229","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A task-based approach to verifying conceptual models
Pub Date: 1996-08-19 | DOI: 10.1109/CMPSAC.1996.544148
Jonathan Lee, L. F. Lai, Wei T. Huang
We propose the use of task-based specifications in conceptual graphs to construct and verify a conceptual model. The task-based specification methodology serves as the mechanism for structuring the knowledge captured in the conceptual model, whereas conceptual graphs are adopted as the formalism for expressing task-based specifications. Verification of a conceptual model is performed on the model specifications of a task through constraint satisfaction and relaxation techniques, and on the process specifications of the task using the resolution algorithm and the notion of specificity.
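To make the constraint-satisfaction-and-relaxation step concrete, here is a heavily simplified sketch. The toy model, constraint names, priorities, and the "drop the weakest violated constraint" relaxation policy are all hypothetical and stand in for the paper's conceptual-graph machinery.

    # Toy conceptual model and constraints; everything named here is invented.
    model = {"member": True, "loan_limit": 5, "borrowed": 6}

    constraints = [
        # (name, priority, predicate) -- lower priority is relaxed first
        ("must_be_member", 2, lambda m: m["member"]),
        ("within_limit", 1, lambda m: m["borrowed"] <= m["loan_limit"]),
    ]

    def verify(model, constraints):
        """Check every constraint; if some fail, relax the lowest-priority
        violated one and re-check the rest."""
        violated = [c for c in constraints if not c[2](model)]
        if not violated:
            return True, []
        relaxed = min(violated, key=lambda c: c[1])
        remaining = [c for c in constraints if c is not relaxed]
        return all(c[2](model) for c in remaining), [relaxed[0]]

    print(verify(model, constraints))  # (True, ['within_limit'])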
{"title":"A task-based approach to verifying conceptual models","authors":"Jonathan Lee, L. F. Lai, Wei T. Huang","doi":"10.1109/CMPSAC.1996.544148","DOIUrl":"https://doi.org/10.1109/CMPSAC.1996.544148","url":null,"abstract":"We propose the use of task based specifications in conceptual graphs to construct and verify a conceptual model. Task based specification methodology is used to serve as the mechanism to structure the knowledge captured in the conceptual model; whereas, conceptual graphs are adopted as the formalism to express task based specifications. Verifying a conceptual model is performed on model specifications of a task through constraint satisfaction and relaxation techniques, and on process specifications of the task based on the resolution algorithm and the notion of specificity.","PeriodicalId":306601,"journal":{"name":"Proceedings of 20th International Computer Software and Applications Conference: COMPSAC '96","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1996-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129370971","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Optimal linear hashing files for orthogonal range retrieval
Pub Date: 1996-08-19 | DOI: 10.1109/CMPSAC.1996.544601
C. Y. Chen, Chinchen Chang, Richard C. T. Lee, D. Lin
We are concerned with the problem of designing optimal linear hashing files for orthogonal range retrieval. Through a study of the performance expressions, we show that optimal basic linear hashing files and optimal recursive linear hashing files for orthogonal range retrieval can, in certain cases, be produced by a greedy method called the MMI (minimum marginal increase) method; we also point out that linear hashing files that are optimal for partial match retrieval need not be optimal for orthogonal range retrieval.
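The abstract does not reproduce the performance expressions, but the greedy idea can be sketched as follows, assuming a stand-in cost approximation (a range covering a fraction r of an attribute's domain overlaps roughly r * 2**d + 1 of its 2**d partitions); the paper's own method uses its exact performance expressions instead.

    from math import prod

    # Stand-in cost model: buckets touched by an orthogonal range query is the
    # product, over attributes, of the partitions its range overlaps.
    def expected_buckets(bits, range_fracs):
        return prod(r * 2 ** bits[a] + 1 for a, r in range_fracs.items())

    def mmi_allocation(attrs, total_bits, range_fracs):
        """Assign split bits one at a time to the attribute whose extra bit
        causes the minimum marginal increase in estimated retrieval cost."""
        bits = {a: 0 for a in attrs}
        for _ in range(total_bits):
            base = expected_buckets(bits, range_fracs)
            def marginal(a):
                return expected_buckets({**bits, a: bits[a] + 1}, range_fracs) - base
            best = min(attrs, key=marginal)
            bits[best] += 1
        return bits

    # Six split bits; typical queries cover 25% of A's domain and 50% of B's.
    print(mmi_allocation(["A", "B"], 6, {"A": 0.25, "B": 0.5}))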
{"title":"Optimal linear hashing files for orthogonal range retrieval","authors":"C. Y. Chen, Chinchen Chang, Richard C. T. Lee, D. Lin","doi":"10.1109/CMPSAC.1996.544601","DOIUrl":"https://doi.org/10.1109/CMPSAC.1996.544601","url":null,"abstract":"We are concerned with the problem of designing optimal linear hashing files for orthogonal range retrieval. Through the study of performance expressions, we show that optimal basic linear hashing files and optimal recursive linear hashing files for orthogonal range retrieval can be produced, in certain cases, by a greedy method called the MMI (minimum marginal increase) method; and it is pointed out that optimal linear hashing files for partial match retrieval need not be optimal for orthogonal range retrieval.","PeriodicalId":306601,"journal":{"name":"Proceedings of 20th International Computer Software and Applications Conference: COMPSAC '96","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1996-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130851676","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Extending statecharts with duration
Pub Date: 1996-08-19 | DOI: 10.1109/CMPSAC.1996.544173
Karl R. P. H. Leung, Daniel K. C. Chan
Statecharts has been widely accepted as a successful graphical language for specifying reactive systems. However, anomalies arise when the durations of activities are subject to different interpretations, such as with or without delay. A number of proposals have been put forward to address these anomalies, but in a rather ad hoc fashion. The paper re-addresses these anomalies using a more uniform approach based on duration calculus. First, many anomalies are corrected by introducing new notations for specifying duration in statecharts. Second, the meanings of these notations are given in terms of duration calculus. Third, statecharts with duration can thereby be subjected to formal reasoning and hence verification. This framework delivers a more uniform extension of the graphical language and enables correctness with respect to duration to be studied formally on an established foundation.
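As an illustration of the duration calculus notation such an extension rests on, here is the classic gas-burner example from the duration calculus literature (not an example taken from the paper): ℓ denotes the length of an observation interval, the integral of Leak the accumulated time the Leak state holds, and the corner brackets that a state holds throughout the interval.

    % Req: over any interval of at least 60 time units, the gas may leak for
    % at most one twentieth of the interval.
    \[
    \mathit{Req} \;\equiv\; \ell \ge 60 \;\Rightarrow\; \int \mathit{Leak} \le \ell / 20
    \]
    % Two design decisions shown in the literature to refine Req: any single
    % leak lasts at most 1 time unit, and successive leaks are at least 30 apart.
    \[
    \lceil \mathit{Leak} \rceil \Rightarrow \ell \le 1
    \qquad
    \lceil \mathit{Leak} \rceil \,;\, \lceil \lnot\mathit{Leak} \rceil \,;\, \lceil \mathit{Leak} \rceil \;\Rightarrow\; \ell \ge 30
    \]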
{"title":"Extending statecharts with duration","authors":"Karl R. P. H. Leung, Daniel K. C. Chan","doi":"10.1109/CMPSAC.1996.544173","DOIUrl":"https://doi.org/10.1109/CMPSAC.1996.544173","url":null,"abstract":"Statecharts has been widely accepted as a successful graphical language for specifying reactive systems. However, some anomalies do exist when the durations of activities are subject to different interpretations such as with or without delay. A number of proposals have been put forward to address these anomalies but in a rather ad hoc fashion. The paper re-addresses these anomalies using a more uniform approach based on duration calculus. First, many anomalies are corrected by introducing new notations for specifying duration in statecharts. Second, the meanings of these notations are given in terms of duration calculus. Third, statecharts with duration can be subject to formal reasoning and hence verification. This framework delivers a more uniform extension to the graphical language as well as enables correctness with respect to duration to be studied formally using an established foundation.","PeriodicalId":306601,"journal":{"name":"Proceedings of 20th International Computer Software and Applications Conference: COMPSAC '96","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1996-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130441031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Classification of weak correctness criteria for real-time database applications
Pub Date: 1996-08-19 | DOI: 10.1109/CMPSAC.1996.544163
Kyuwoong Lee, Seog Park
For real-time database systems, transaction processing must satisfy not only logical consistency constraints but also timing constraints. Conflict serializability is too restrictive to achieve acceptable throughput and predictable response times. Moreover, serializability may not be necessary for concurrent execution, and different correctness criteria may be applied to different applications depending on the semantics and requirements of their transactions. We classify consistency into six forms and propose a relaxed notion of serializability, called statewise serializability, as the weakest form of consistency in our classification. Statewise serializability alleviates the strictness of serializability by allowing controlled inconsistent read operations. It can properly be used as a correctness criterion in real-time database applications. We also present an algorithm that determines whether schedules are statewise serializable, and compare it to other correctness criteria.
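For contrast with the relaxed criterion, here is a minimal sketch of the baseline it weakens: conflict serializability tested via a precedence (serialization) graph. Statewise serializability itself is defined in the paper and is not reproduced here; the schedule below is invented.

    from itertools import combinations

    def conflict_serializable(schedule):
        """schedule: list of (txn, op, item) with op in {'r', 'w'}, in execution
        order. Returns True iff the precedence graph is acyclic."""
        txns = {t for t, _, _ in schedule}
        edges = set()
        for (t1, o1, x1), (t2, o2, x2) in combinations(schedule, 2):
            if t1 != t2 and x1 == x2 and 'w' in (o1, o2):
                edges.add((t1, t2))  # earlier conflicting op -> later one
        # Kahn-style topological check for a cycle.
        indeg = {t: 0 for t in txns}
        for _, b in edges:
            indeg[b] += 1
        ready = [t for t, d in indeg.items() if d == 0]
        seen = 0
        while ready:
            t = ready.pop()
            seen += 1
            for a, b in edges:
                if a == t:
                    indeg[b] -= 1
                    if indeg[b] == 0:
                        ready.append(b)
        return seen == len(txns)

    # T2 writes x between T1's read and write of x: rejected by this test,
    # though a relaxed criterion might still accept it.
    print(conflict_serializable([("T1", "r", "x"), ("T2", "w", "x"), ("T1", "w", "x")]))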
{"title":"Classification of weak correctness criteria for real-time database applications","authors":"Kyuwoong Lee, Seog Park","doi":"10.1109/CMPSAC.1996.544163","DOIUrl":"https://doi.org/10.1109/CMPSAC.1996.544163","url":null,"abstract":"For real-time database systems, transaction processing must satisfy not only logical consistency constraints but also timing constraints. Conflict serializability is too restrictive to achieve the acceptable throughput and predictable response time. Moreover, serializability may not be necessary for concurrent execution and different correctness criteria may be applied to different applications depending on the semantics and the requirements of transactions. We classify the consistency into six forms and propose a relaxed serializability, called statewise serializability, as the weakest form of consistency in our classification. Statewise serializability alleviates the strictness of serializability by allowing for a controlled inconsistent read operation. It can be properly used as a correctness criterion in real-time database applications. We also present the algorithm that determines whether the schedules are statewise serializable, and compare it to other correctness criteria.","PeriodicalId":306601,"journal":{"name":"Proceedings of 20th International Computer Software and Applications Conference: COMPSAC '96","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1996-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126378023","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A fuzzy association algebra based on a fuzzy object oriented data model
Pub Date: 1996-08-19 | DOI: 10.1109/CMPSAC.1996.544177
Selee Na, Seog Park
The complexity of real applications in the field of intelligent information systems has created a need for fuzzy data models for the expression and processing of uncertain and imprecise data. We propose a fuzzy association algebra (FA-algebra) as a query algebra for a new fuzzy object-oriented data model (F-model). The F-model is investigated as a fuzzy extension of an object-oriented data model in which fuzzy objects and their fuzzy associations are represented. In the FA-algebra, fuzzy objects and fuzzy associations are uniformly represented by fuzzy association patterns. As the result of applying the operators defined in the FA-algebra, the returned fuzzy association patterns carry truth values expressing the degree to which each pattern is suitable as an answer to the query. The completeness of the FA-algebra is shown.
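Here is a minimal sketch of how truth values might propagate when fuzzy associations are composed, using the standard max-min composition; the FA-algebra's actual operators are defined in the paper, and the objects and degrees below are invented.

    # Fuzzy associations: (object, object) -> membership degree in [0, 1].
    advises = {("prof_kim", "student_lee"): 0.9}
    works_on = {("student_lee", "project_x"): 0.6}

    def compose(assoc1, assoc2):
        """Join two fuzzy associations along a shared object; a composed
        pattern's truth value is the weakest link on its path, and the
        strongest path wins when several exist."""
        out = {}
        for (a, b), d1 in assoc1.items():
            for (b2, c), d2 in assoc2.items():
                if b == b2:
                    out[(a, c)] = max(out.get((a, c), 0.0), min(d1, d2))
        return out

    # "To what degree does prof_kim advise someone working on project_x?"
    print(compose(advises, works_on))  # {('prof_kim', 'project_x'): 0.6}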
{"title":"A fuzzy association algebra based on a fuzzy object oriented data model","authors":"Selee Na, Seog Park","doi":"10.1109/CMPSAC.1996.544177","DOIUrl":"https://doi.org/10.1109/CMPSAC.1996.544177","url":null,"abstract":"The complexity of real applications in the field of intelligent information systems has required fuzzy data models for the expression and processing of uncertain and imprecise data. We propose a fuzzy association algebra (FA-algebra) as a query algebra for a new fuzzy object oriented data model (F-model). The F-model is investigated as fuzzy extensions of an object oriented data model, in which fuzzy objects and their fuzzy associations are represented. In FA-algebra, fuzzy objects and fuzzy associations are uniformly represented by fuzzy association patterns. As the results of operations by the operators defined in the FA-algebra, the returned fuzzy association patterns contain the truth values which mean the degrees of suitability of patterns as answers for the queries. The completeness of the FA-algebra is shown.","PeriodicalId":306601,"journal":{"name":"Proceedings of 20th International Computer Software and Applications Conference: COMPSAC '96","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1996-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133751761","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Recovery technique based on fuzzy checkpoint in a client/server database system
Pub Date: 1996-08-19 | DOI: 10.1109/CMPSAC.1996.545881
Eui-In Choi, Hae-Chull Lim
We propose improved techniques for logging, checkpointing, and recovery in a client-server database environment. The key features of our scheme are: (1) the server holds only main-memory databases and a log on a battery-backed device; (2) the clients perform checkpointing, which makes the server aware of the pages dirtied by each client; (3) restart processing is simple and is made quicker by using a redo reduction technique; and (4) database consistency is guaranteed even during checkpointing by improving fuzzy checkpointing.
{"title":"Recovery technique based on fuzzy checkpoint in a client/server database system","authors":"Eui-In Choi, Hae-Chull Lim","doi":"10.1109/CMPSAC.1996.545881","DOIUrl":"https://doi.org/10.1109/CMPSAC.1996.545881","url":null,"abstract":"We propose improved techniques for logging, checkpointing, and recovery in a client-server database environment. The key features of our scheme are (1) the server only contains main-memory databases and a battery backup device log. (2) The clients perform checkpoint; this makes the server acknowledge the page dirtied by the client. (3) Restart processing is simple and made quicker by using the redo reduction technique. Finally, (4) database consistency is guaranteed even during checkpointing by improving fuzzy checkpointing.","PeriodicalId":306601,"journal":{"name":"Proceedings of 20th International Computer Software and Applications Conference: COMPSAC '96","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1996-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122948234","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A team-based process improvement initiative
Pub Date: 1996-08-19 | DOI: 10.1109/CMPSAC.1996.544610
P. Wong
Summary form only given. The Information Technology Institute (ITI) thrives in a research and development environment similar to that of many applied research institutes in the United States. It delivers projects that apply advanced and useful high technologies to difficult problems. Most of these projects have a fixed deadline, be it a launch date or a deployment date. Hence, the process supporting these projects needs to be sensitive to the management of scope and resources. The improvement process started in 1992, when ITI embarked on a journey to define an ISO 9000-compliant quality management system (QMS). I take a macro perspective toward deriving a process best suited to any environment. Instead of describing the processes in use in our organisation, I emphasise the participation of teams in the definition and deployment of the QMS.
{"title":"A team-based process improvement initiative","authors":"P. Wong","doi":"10.1109/CMPSAC.1996.544610","DOIUrl":"https://doi.org/10.1109/CMPSAC.1996.544610","url":null,"abstract":"Summary form only given. The Information Technology Institute (ITI) thrives on a research and development environment that is similar to many applied research institutes in the United States. It delivers projects that apply advanced and useful high technologies to difficult problems. Most of these projects have a fixed deadline, be it a launch date or a deployment date. Hence, the process supporting these projects needs to be sensitive to the management of scope and resources. The improvement process started in 1992, when ITI embarked on a journey to define an IS0 9000-compliant quality management system (QMS). I take a macro perspective towards deriving a process best suited to any environment. Instead of describing the processes that are in use in our organisation, I emphasise the participation of teams in the definition and deployment of the QMS.","PeriodicalId":306601,"journal":{"name":"Proceedings of 20th International Computer Software and Applications Conference: COMPSAC '96","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1996-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133820535","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Applying conventional testing techniques for class testing
Pub Date: 1996-08-19 | DOI: 10.1109/CMPSAC.1996.544612
I. Chung, M. Munro, W. Lee, Y. Kwon
We discuss how conventional testing criteria such as branch coverage can be applied to the testing of member functions inside a class. To support such testing techniques we employ symbolic execution and finite state machines (FSMs). Symbolic execution is performed on the code of a member function to identify the states required to fulfil a given criterion, and FSMs are used to generate a sequence of member functions leading to those states. Our technique is a mixture of code-based and specification-based testing in the sense that it uses information derived from the code through symbolic execution together with information from specifications expressed as FSMs.
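A minimal sketch of the FSM half of the technique, assuming a toy stack-like class: given a target state that symbolic execution has identified as necessary for a coverage goal, a breadth-first search over the class's state machine yields a member-function call sequence that reaches it. The states and transitions below are invented for illustration and are not the authors' tool.

    from collections import deque

    # Invented state machine for a stack-like class:
    # (state, member function) -> next state.
    fsm = {
        ("empty", "push"): "non_empty",
        ("non_empty", "push"): "non_empty",
        ("non_empty", "pop"): "maybe_empty",
        ("maybe_empty", "push"): "non_empty",
    }

    def call_sequence(start, target):
        """Shortest member-function sequence driving the object from
        `start` to `target`, found by breadth-first search."""
        queue = deque([(start, [])])
        visited = {start}
        while queue:
            state, seq = queue.popleft()
            if state == target:
                return seq
            for (s, fn), nxt in fsm.items():
                if s == state and nxt not in visited:
                    visited.add(nxt)
                    queue.append((nxt, seq + [fn]))
        return None

    # Suppose symbolic execution flags "maybe_empty" as the state needed to
    # cover a branch inside pop(); the FSM yields a setup sequence for it.
    print(call_sequence("empty", "maybe_empty"))  # ['push', 'pop']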
{"title":"Applying conventional testing techniques for class testing","authors":"I. Chung, M. Munro, W. Lee, Y. Kwon","doi":"10.1109/CMPSAC.1996.544612","DOIUrl":"https://doi.org/10.1109/CMPSAC.1996.544612","url":null,"abstract":"We discuss how conventional testing criteria such as branch coverage can be applied to the testing of member functions inside a class. To support such testing techniques we employ symbolic execution techniques and finite state machines (FSMs). Symbolic execution is performed on the code of a member function to identify states that are required to fulfil a given criterion. We use FSMs to generate a sequence of member functions leading to the identified states. Our technique is a mixture of code-based and specification-based testing techniques in the sense that it uses information derived from codes using symbolic execution together with information from specifications using FSMs for testing activities.","PeriodicalId":306601,"journal":{"name":"Proceedings of 20th International Computer Software and Applications Conference: COMPSAC '96","volume":"477 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1996-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116016073","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}