Software process and product specifications: a basis for generating customized SE information bases
H. D. Rombach, L. Mark
Pub Date: 1989-01-03. DOI: 10.1109/HICSS.1989.47990
Proceedings of the Twenty-Second Annual Hawaii International Conference on System Sciences, Volume II: Software Track.
General requirements for software process specification languages are discussed. A first prototype software process specification language is presented, its application is demonstrated, and software-engineering-related requirements for a supporting information base are derived. Efforts aimed at implementing the information-base requirements are briefly mentioned. This work is part of the Meta Information Base project at the University of Maryland.
Trace based verification of parallel programs with shared variables
S. Gjessing, E. Munthe-Kaas
Pub Date: 1989-01-03. DOI: 10.1109/HICSS.1989.48005
Proceedings of the Twenty-Second Annual Hawaii International Conference on System Sciences, Volume II: Software Track.
A partial correctness proof method, based on reasoning about process traces, is presented for a language with parallel programs and shared variables. A main advantage of the approach is that properties of each process are first proved in isolation. The properties of the complete system are then found by using these process properties in a proof rule for parallel composition. This supports a modular construction and verification technique. A (mythical) trace variable is added to each process. When a Boolean expression is evaluated, a side effect records the expression and its Boolean value in the trace variable. Write operations are also recorded in the trace. It is possible to reduce the amount of information recorded in the trace variable and hence make the proofs of weak properties even more manageable. An example verification is given.
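The trace mechanism the abstract describes can be illustrated with a toy sketch (hypothetical code, not the authors' formalism): each process appends an entry to its trace whenever a Boolean guard is evaluated or a shared variable is written.

```python
# Toy illustration of a "mythical" trace variable: every Boolean-guard
# evaluation and every write a process performs is recorded, in order.

class TracedProcess:
    def __init__(self, name):
        self.name = name
        self.trace = []   # the mythical trace variable
        self.store = {}   # this process's writes to shared variables

    def eval_guard(self, expr_text, value):
        # Side effect of evaluating a Boolean expression: record it.
        self.trace.append(("guard", expr_text, bool(value)))
        return value

    def write(self, var, value):
        # Write operations are also recorded in the trace.
        self.store[var] = value
        self.trace.append(("write", var, value))

p = TracedProcess("P1")
if p.eval_guard("x < 10", 3 < 10):
    p.write("x", 3)

# p.trace now holds both events in execution order:
# [("guard", "x < 10", True), ("write", "x", 3)]
```

Properties of `P1` could then be proved from `p.trace` alone, without inspecting other processes, matching the modular style the abstract advocates.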
An evaluation of text access methods
E. Bertino, F. Marinaro
Pub Date: 1989-01-03. DOI: 10.1109/HICSS.1989.48090
Proceedings of the Twenty-Second Annual Hawaii International Conference on System Sciences, Volume II: Software Track.
Two access methods are presented for text retrieval that are based on signature file techniques. Analytical formulas that model the access cost functions are presented. The cost functions are given for both magnetic and optical devices, in an environment where the text access methods are integrated with indexes.
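A common signature-file technique is superimposed coding, which the following sketch illustrates (parameters and hashing are assumptions for illustration; the paper's two methods and cost models are not reproduced here): each word hashes to a few bit positions, the bits of all words in a text block are OR-ed into a block signature, and a query word can only occur in blocks whose signature covers its bits.

```python
# Sketch of signature-file text retrieval via superimposed coding.
# SIG_BITS and WORD_BITS are illustrative choices, not the paper's.
import hashlib

SIG_BITS = 64   # width of a block signature
WORD_BITS = 3   # bit positions set per word

def word_signature(word):
    # Derive WORD_BITS bit positions from a stable hash of the word.
    sig = 0
    digest = hashlib.sha256(word.encode()).digest()
    for i in range(WORD_BITS):
        sig |= 1 << (digest[i] % SIG_BITS)
    return sig

def block_signature(text):
    # Superimpose (OR) the signatures of all words in a text block.
    sig = 0
    for word in text.lower().split():
        sig |= word_signature(word)
    return sig

def may_contain(block_sig, word):
    # The block can contain the word only if all its bits are set.
    # False positives are possible; false negatives are not.
    w = word_signature(word)
    return block_sig & w == w

blocks = ["signature files for text retrieval", "optical disk access costs"]
sigs = [block_signature(b) for b in blocks]
hits = [i for i, s in enumerate(sigs) if may_contain(s, "retrieval")]
# Block 0 is always reported; block 1 may occasionally appear as a
# false positive, which a post-filtering scan would eliminate.
```

The cost trade-off the paper analyzes follows from this structure: scanning compact signatures is cheap, but every false positive forces an access to the underlying text, whose price differs between magnetic and optical devices.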
A knowledge-based design environment for graphical network editors
S. Henninger, A. Ignatowski, C. Rathke, D. Redmiles
Pub Date: 1989-01-03. DOI: 10.1109/HICSS.1989.48098
Proceedings of the Twenty-Second Annual Hawaii International Conference on System Sciences, Volume II: Software Track.
A graphical editor design environment that incorporates and applies knowledge about application domains has been developed. The goal is to move the design environment closer to its application domain. As an example of this generation of design support systems, a design environment for graphical editors in the domain of object-oriented inheritance networks is presented. In addition to general knowledge about graphs, the system knows about inheritance mechanisms in object-oriented systems, and it knows that the nodes are classes and the links represent the superclass relation. This knowledge is used to provide guidance, critiques, and constraints.
Consistency issues in real-time database systems
K.-J. Lin
Pub Date: 1989-01-03. DOI: 10.1109/HICSS.1989.48069
Proceedings of the Twenty-Second Annual Hawaii International Conference on System Sciences, Volume II: Software Track.
A consistency model for real-time database systems is presented that distinguishes external data consistency from the internal data consistency maintained by traditional systems, with the goal of scheduling transactions to meet deadlines. External consistency requires that the data used by a transaction reflect the current physical environment; this is in contrast to internal consistency, which presents a view consistent with the predefined constraints of the database. It is suggested that external consistency is preferable to internal consistency for many transactions. Operationally consistent schedules are defined that emphasize the operational effect of databases on the external world. A protocol that ensures the external consistency of transactions is presented.
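The external/internal distinction can be made concrete with a small hypothetical sketch (not the paper's protocol): internal consistency checks integrity constraints over stored values, while external consistency checks that a value is still fresh enough to reflect the physical environment.

```python
# Hypothetical freshness check illustrating external consistency:
# a transaction may only use a data item whose reading is recent
# enough to still describe the physical environment.

class DataItem:
    def __init__(self, value, written_at, validity):
        self.value = value
        self.written_at = written_at  # time the reading was taken
        self.validity = validity      # interval it reflects the environment

def externally_consistent(item, now):
    # Internal consistency would instead check predefined database
    # constraints; external consistency depends on the data's age.
    return now - item.written_at <= item.validity

temp = DataItem(value=21.5, written_at=100.0, validity=5.0)
fresh = externally_consistent(temp, now=103.0)   # True: still reflects reality
stale = externally_consistent(temp, now=110.0)   # False: must re-sample first
```

A transaction built on such checks can be internally consistent yet externally inconsistent, which is precisely the gap the abstract's scheduling protocol is designed to close.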
An agenda for research in the managerial evaluation of computer-aided software engineering (CASE) tool impacts
C. Kemerer
Pub Date: 1989-01-03. DOI: 10.1109/HICSS.1989.47995
Proceedings of the Twenty-Second Annual Hawaii International Conference on System Sciences, Volume II: Software Track.
It is suggested that the complete lack of validated research demonstrating productivity benefits of computer-aided software engineering (CASE) tools is due to a number of inherent difficulties in the CASE evaluation process. A research agenda is set forth to address the shortfalls in managers' current ability to evaluate these tools. Managerial impacts that are commonly associated with CASE tools are described, and it is shown why measuring these impacts can be difficult. Additionally, some less commonly cited impacts are raised, and suggestions for research in these areas are made. The importance of models of software development to research in this area is discussed. Three popular research methodologies (experiments, field studies, and surveys) are described, and their limitations are examined.
Software measurement and analysis: a case study in collaborative research
N. Fenton
Pub Date: 1989-01-03. DOI: 10.1109/HICSS.1989.48064
Proceedings of the Twenty-Second Annual Hawaii International Conference on System Sciences, Volume II: Software Track.
The Alvey Programme, sponsored by the UK government in 1983, encouraged academic and industrial institutions to work together to develop leading-edge research. One of the four major themes of the program was software engineering, which was further subdivided into (a) formal methods, (b) reliability and metrics, and (c) IPSEs (Integrated Project Support Environments). The author discusses one Alvey project that was unusual inasmuch as it encompassed both (a) and (b). The major objective of the project was to develop rigorous techniques for analyzing and measuring structural properties of systems. The author describes the achievements and failures of the project, the lessons to be learned, and how the very perception of software measurement changed fundamentally during the project. There are recommendations both for future research work in this area and for the nature of future collaborative projects.
The use of decomposition in an object-oriented approach to present and represent multimedia documents
R. Cordes, M. Hofmann, H. Langendorfer, R. Buck-Emden
Pub Date: 1989-01-03. DOI: 10.1109/HICSS.1989.48091
Proceedings of the Twenty-Second Annual Hawaii International Conference on System Sciences, Volume II: Software Track.
A prototype of a multimedia document processing system called MuBIS (multimedia bureau information system), which integrates laser-optical disks, has been designed, developed, and implemented. It is tailored to office applications involving multimedia document management. Using an object-oriented approach enriched with the mechanisms of layering, aggregation, and decomposition, a system has been built that satisfies the needs of the complex structure of its documents and the relationships between them.
Speculative parallelism in a distributed graph reduction machine
A. Partridge, A. Dekker
Pub Date: 1989-01-03. DOI: 10.1109/HICSS.1989.48085
Proceedings of the Twenty-Second Annual Hawaii International Conference on System Sciences, Volume II: Software Track.
A scheme for adding speculative evaluation to the distributed implementation of a lazy functional language is presented. The scheme assigns reduced scheduling priorities to speculative computations to prevent them from overwhelming processing resources or altering the program's semantics. Scheduling priorities are dynamically adjusted during execution as speculative computations are found to be needed. By terminating computations associated with reclaimed pieces of graph, a distributed reference counting algorithm can be used to reclaim garbage nodes and to detect and terminate computations that are not required. A scheduling scheme and load balancing that operate in the presence of prioritised computations are briefly presented.
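The priority discipline the abstract describes can be sketched in miniature (an illustrative model, not the paper's distributed machine): speculative computations start at a reduced priority, and a computation found to be needed is promoted so that mandatory work is never starved by speculation.

```python
# Illustrative priority scheduler for speculative computations.
# Names and priority levels are assumptions made for this sketch.
import heapq

MANDATORY, SPECULATIVE = 0, 1   # lower value runs first

class Scheduler:
    def __init__(self):
        self.heap = []
        self.seq = 0   # tie-breaker keeps ordering deterministic

    def spawn(self, name, priority):
        heapq.heappush(self.heap, (priority, self.seq, name))
        self.seq += 1

    def promote(self, name):
        # A speculative computation was found to be needed:
        # dynamically raise it to mandatory priority.
        self.heap = [(MANDATORY if n == name else p, s, n)
                     for p, s, n in self.heap]
        heapq.heapify(self.heap)

    def run_order(self):
        order = []
        while self.heap:
            order.append(heapq.heappop(self.heap)[2])
        return order

s = Scheduler()
s.spawn("needed-branch", MANDATORY)
s.spawn("maybe-branch", SPECULATIVE)
s.spawn("other-work", MANDATORY)
s.promote("maybe-branch")   # its value is now demanded
# run_order() -> ["needed-branch", "maybe-branch", "other-work"]
```

In the paper's setting the complementary case is handled by garbage collection: a speculative task whose graph node is reclaimed is detected as unneeded and terminated rather than promoted.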
The Mothra tool set (software testing)
B. J. Choi, R. DeMillo, E. W. Krauser, R. J. Martin, Aditya P. Mathur, A. J. Offutt, H. Pan, E. Spafford
Pub Date: 1989-01-03. DOI: 10.1109/HICSS.1989.48002
Proceedings of the Twenty-Second Annual Hawaii International Conference on System Sciences, Volume II: Software Track.
Mothra is a software test environment that supports mutation-based testing of software systems. Mutation analysis is a powerful software testing technique that evaluates the adequacy of test data based on its ability to differentiate between the program under test and its mutants, where mutants are constructed by inserting single, simple errors into the program under test. This evaluation process also provides guidance in the creation of new test cases to provide more adequate testing. Mothra consists of a collection of individual tools, each of which implements a separate, independent function for the testing system. The initial Mothra tool set, for the most part, duplicates functionality existing in previous mutation analysis systems. Current efforts are concentrated on extending this basic tool set to include capabilities previously unavailable to the software testing community. The authors describe the Mothra tool set and the extensions planned for the future.
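Mutation analysis as the abstract defines it can be shown in a toy sketch (hypothetical code, not the actual Mothra tools): mutants differ from the program by one simple error, and a test suite is scored by how many mutants it distinguishes from the original.

```python
# Toy mutation analysis in the spirit of Mothra: score test data by its
# ability to "kill" mutants created with single, simple errors.

def program(x, y):
    return x + y

mutants = [
    lambda x, y: x - y,   # operator replacement: + becomes -
    lambda x, y: x * y,   # operator replacement: + becomes *
    lambda x, y: x + x,   # operand replacement:  y becomes x
]

def mutation_score(tests):
    # A mutant is killed if some test makes it disagree with the program.
    killed = sum(
        any(m(x, y) != program(x, y) for x, y in tests)
        for m in mutants
    )
    return killed / len(mutants)

weak_tests = [(2, 2)]            # 2+2 == 2*2 == 2+2: two mutants survive
strong_tests = [(2, 2), (3, 1)]  # adding (3, 1) kills all three mutants
```

The surviving mutants under `weak_tests` point directly at the missing test case, which is the guidance-for-new-test-cases role the abstract attributes to mutation analysis.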