Pub Date: 1992-01-07 | DOI: 10.1109/HICSS.1992.183511
Y.-G. Kim, S. March
Conceptual data modeling is used in a variety of IS activities, including enterprise modeling, system requirements determination, logical database design, and data-oriented application development. Recently, a number of empirical studies have examined conceptual data models. The results from these studies are fragmented, often statistically insignificant, and sometimes conflicting. The authors examine the current literature and explain why the results have been so inconclusive. They propose a theoretical basis for empirical research in this area and discuss issues related to external validity, measurement, and Type III errors in this line of research. Finally, they summarize preliminary findings from a study organized according to this theoretical basis.
{"title":"Conceptual data modeling: assessing empirical studies","authors":"Y.-G. Kim, S. March","doi":"10.1109/HICSS.1992.183511","DOIUrl":"https://doi.org/10.1109/HICSS.1992.183511","url":null,"abstract":"Conceptual data modeling is used in a variety of IS activities, including enterprise modeling, system requirements determination, logical database design, and data oriented application development. Recently a number of empirical studies have examined conceptual data models. The results from these studies are fragmented, often statistically insignificant, and sometimes conflicting. The authors examine the current literature and explain why the results have been so inconclusive. They pose a theoretical basis for empirical research in this area and discuss issues related to external validity, measurement, and type three errors as they relate to this area of research. Finally, they summarize preliminary findings from a study organized according to this theoretical basis.<<ETX>>","PeriodicalId":103288,"journal":{"name":"Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125231959","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 1992-01-07 | DOI: 10.1109/HICSS.1992.183276
H. Gomaa
Describes an object-oriented domain analysis and modeling method for a family of systems. This method addresses the issues of how to represent an application domain by means of multiple views, and how to represent similarities and variations in the domain. The method also supports an approach for generating a target system specification from the domain model, given the requirements of an individual target system. The goal is to provide a more effective way of managing system evolution and addressing software reuse from a generation technology perspective. The method is illustrated by means of an example.
{"title":"An object-oriented domain analysis and modeling method for software reuse","authors":"H. Gomaa","doi":"10.1109/HICSS.1992.183276","DOIUrl":"https://doi.org/10.1109/HICSS.1992.183276","url":null,"abstract":"Describes an object-oriented domain analysis and modeling method for analyzing and modeling a family of systems. This method addresses the issues of how to represent an application domain by means of multiple views, and how to represent similarities and variations in the domain. The method also supports an approach for generating a target system specification from the domain model, given the requirements of an individual target system. The goal is to provide a more effective way of managing system evolution and addressing software reuse from a generation technology perspective. The method is illustrated by means of an example.<<ETX>>","PeriodicalId":103288,"journal":{"name":"Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122713805","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 1992-01-07 | DOI: 10.1109/HICSS.1992.183353
M. Maybury, S. Belardo
The application of knowledge-based reasoning to the task of strategic corporate planning is investigated. An implemented computer program, FIVE-FORCES, is described that models the five major forces affecting corporate strategy: customers, suppliers, new entrants, substitute products, and competitors. In addition, the effects of industry structure and the socioeconomic environment are examined. The knowledge engineering methodology employed in FIVE-FORCES is discussed and illustrated with specific examples. It is argued that the knowledge represented in the FIVE-FORCES expert system is more directly analogous to real-world situations than that encoded in traditional rule-based paradigms. As a result, the behavior of the system is more natural and perspicuous. More importantly, the underlying model can be tested and validated against a board of experts or a test suite of situations. FIVE-FORCES is claimed to be a general knowledge engineering shell that can be used as a simulation tool for corporate-level strategic analysis in a wide range of industries and businesses.
{"title":"FIVE FORCES: a knowledge based simulation of strategic planning","authors":"M. Maybury, S. Belardo","doi":"10.1109/HICSS.1992.183353","DOIUrl":"https://doi.org/10.1109/HICSS.1992.183353","url":null,"abstract":"An investigation is given on the application of knowledge-based reasoning to the task of strategic corporate planning. An implemented computer program, FIVE-FORCES, is described which models the five major forces affecting corporate strategy: customers, suppliers, new entrants, substitute products, and competitors. In addition, the effects of industry structure and the socioeconomic environment are examined. The knowledge engineering methodology employed in FIVE-FORCES is discussed and illustrated with specific examples. It is argued that the knowledge represented in the FIVE-FORCES expert system is more directly analogous to real-world situations than that encoded in traditional rule-based paradigms. As a result, the behavior of the system is more natural and perspicuous. More importantly, the underlying model can be tested and validated against a board of experts or test suite of situations. FIVE-FORCES is claimed to be a general knowledge engineering shell which can be used as a simulation tool to evaluate corporate-level strategic analysis in a myriad of industries and businesses.<<ETX>>","PeriodicalId":103288,"journal":{"name":"Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122834877","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 1992-01-07 | DOI: 10.1109/HICSS.1992.183209
M. Rosing, J.N. Thomas
The authors describe hardware to reduce message latency and hide the existence of message passing on distributed-memory multiprocessors. Existing message passing systems are visible to the programmer and require setup time for each message. The authors propose a system in which normal processor memory reads and writes cause a communications processor to send or receive messages as necessary to implement the read or write operation. To implement this, a section of memory is typed, describing the actions needed when that memory is read or written. Communications processor setup commands provide tables giving the memory layout. This can include full/empty bit synchronization, counted writers synchronization, multiple recipients, broadcasting, and remote procedure call support. The authors provide a justification, a description of the mechanism, and an outline of a proposed hardware and software implementation.
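To make the typed-memory idea concrete, the C sketch below shows one way a layout table of this general kind could be consulted on each store: a descriptor maps an address range to a region type and, for forwarded stores, a destination node. This is only an illustrative sketch of the concept under assumed names; the region types, structure fields, and functions are invented here and are not taken from the authors' hardware design.

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Kinds of typed memory regions (hypothetical categories, loosely echoing
   the full/empty, multi-recipient, and broadcast behaviors named above). */
enum region_type {
    REGION_LOCAL,        /* plain memory, no communication implied      */
    REGION_REMOTE_WRITE, /* a local store is forwarded to a remote node */
    REGION_FULL_EMPTY,   /* stores mark a word full for waiting readers */
    REGION_BROADCAST     /* a store is replicated to every node         */
};

/* One entry of the layout table a communications processor would consult. */
struct region_descriptor {
    uintptr_t        base;      /* start of the typed region             */
    size_t           length;    /* size of the region in bytes           */
    enum region_type type;      /* action implied by a read or a write   */
    int              dest_node; /* target node for REMOTE_WRITE regions  */
};

/* Find the descriptor covering an address, or NULL if the address is untyped. */
static const struct region_descriptor *
lookup_region(const struct region_descriptor *table, size_t n, uintptr_t addr)
{
    for (size_t i = 0; i < n; i++)
        if (addr >= table[i].base && addr < table[i].base + table[i].length)
            return &table[i];
    return NULL;
}

/* Decide what a write to `addr` implies; here we only print the decision. */
static void on_write(const struct region_descriptor *table, size_t n,
                     uintptr_t addr)
{
    const struct region_descriptor *r = lookup_region(table, n, addr);
    if (!r || r->type == REGION_LOCAL)
        printf("0x%lx: plain local write, no message\n", (unsigned long)addr);
    else if (r->type == REGION_REMOTE_WRITE)
        printf("0x%lx: forward store to node %d\n", (unsigned long)addr,
               r->dest_node);
    else if (r->type == REGION_BROADCAST)
        printf("0x%lx: replicate store to all nodes\n", (unsigned long)addr);
    else /* REGION_FULL_EMPTY */
        printf("0x%lx: mark word full and wake waiting readers\n",
               (unsigned long)addr);
}

int main(void)
{
    /* A toy layout table; addresses and sizes are arbitrary. */
    const struct region_descriptor table[] = {
        { 0x1000, 0x100, REGION_REMOTE_WRITE, 3 },
        { 0x2000, 0x100, REGION_FULL_EMPTY,   0 },
        { 0x3000, 0x100, REGION_BROADCAST,    0 },
    };
    size_t n = sizeof table / sizeof table[0];

    on_write(table, n, 0x1010); /* lands in the remote-write region */
    on_write(table, n, 0x2040); /* lands in the full/empty region   */
    on_write(table, n, 0x9000); /* untyped address, ordinary memory */
    return 0;
}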
{"title":"Reducing message latency by making message passing transparent","authors":"M. Rosing, J.N. Thomas","doi":"10.1109/HICSS.1992.183209","DOIUrl":"https://doi.org/10.1109/HICSS.1992.183209","url":null,"abstract":"The authors describe hardware to reduce message latency and hide message passing's existence on distributed memory multiprocessors. Existing message passing systems are visible to the programmer and require setup time for each message. The authors propose a system in which normal processor memory reads and writes cause a communications processor to send or receive messages as necessary to implement the read or write operation. To implement this a section of memory is typed, describing the actions needed when that memory is read or written. Communications processor setup commands provide tables giving the memory layout. This can include full/empty bit synchronization, counted writers synchronization, multiple recipients, broadcasting, and remote procedure call support. The authors provide a justification, a mechanism description, and a proposed hardware and software implementation outline.<<ETX>>","PeriodicalId":103288,"journal":{"name":"Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences","volume":"185 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131541447","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 1992-01-07 | DOI: 10.1109/HICSS.1992.183409
B. Post
As groupwork gains recognition, emerging group support technologies raise questions about the merits of these systems relative to group performance and return on investment. Business case variables of efficiency, quality, effectiveness, customer satisfaction, and decision-making are useful in measuring the potential contribution that group support technologies offer. The author presents findings from a recent field study that used business case concepts as its design approach. He explores the infrastructure development requirements for building a business case. Such a framework is useful to business decision-makers and researchers interested in the deployment of these technologies in complex business environments.
{"title":"Building the business case for group support technology","authors":"B. Post","doi":"10.1109/HICSS.1992.183409","DOIUrl":"https://doi.org/10.1109/HICSS.1992.183409","url":null,"abstract":"As groupwork gains recognition, emerging group support technologies raise questions about the merits of these systems relative to group performance and return on investment. Business case variables of efficiency, quality, effectiveness, customer satisfaction and decision-making are useful in measuring the potential contribution that group support technologies offer. The author presents findings from a recent field study that used business case concepts as its design approach. He explores the infrastructure development requirements for building a business case study. Such a framework is useful to business decision-makers and researchers interested in the deployment of these technologies in complex business environments.<<ETX>>","PeriodicalId":103288,"journal":{"name":"Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121756864","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 1992-01-07 | DOI: 10.1109/HICSS.1992.183487
W. Hefley
Discusses an approach to, and issues regarding, the use of intelligent adaptive user interfaces, coupled with hypermedia, to provide supportive learning situations for users within an instructional paradigm of cognitive apprenticeship. Existing models of user interaction are examined and extended with a supportive learning model of user interaction. The paper addresses the learning opportunities provided by implementing this model of interaction using intelligent computer-human interaction, and identifies issues that remain to be resolved in providing supportive learning situations within the apprenticeship learning paradigm.
{"title":"Apprenticeship instruction through adaptive human-computer interaction","authors":"W. Hefley","doi":"10.1109/HICSS.1992.183487","DOIUrl":"https://doi.org/10.1109/HICSS.1992.183487","url":null,"abstract":"Discusses an approach and issues regarding the use of intelligent adaptive user interfaces, coupled with hypermedia, to provide supportive learning situations for users within an instructional paradigm of cognitive apprenticeship. Existing models of user interaction are examined and extended with a supportive learning model of user interaction. The paper addresses learning opportunities provided by implementing this model of interaction using intelligent computer-human interaction, as well as identifies issues which are yet to be resolved in providing supportive learning situations within the apprenticeship learning paradigm.<<ETX>>","PeriodicalId":103288,"journal":{"name":"Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences","volume":"210 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132591429","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 1992-01-07 | DOI: 10.1109/HICSS.1992.183200
M. Aleksic, V. Milutinovic
The authors introduce a layered classification of visualization primitives, according to which graphics-related functions are organized into four levels. The first level refers to drawing; that level is typically supported by dedicated graphics coprocessors such as the TMS340x0. The second and third levels refer to the handling of graphics objects in a window environment. Level four refers to multimedia integration. Hardware/software acceleration of functions on levels two and three is the subject of this research. After a simulation-based statistical analysis of selected benchmark programs, four functions on levels two and three are selected as those most critically in need of acceleration. Algorithmic improvements are introduced for these four functions, and the details of their hardware and/or software support are explained. Results of an analytical evaluation are also given.
{"title":"Architecture support for window environments","authors":"M. Aleksic, V. Milutinovic","doi":"10.1109/HICSS.1992.183200","DOIUrl":"https://doi.org/10.1109/HICSS.1992.183200","url":null,"abstract":"The authors introduce a layered classification of visualization primitives, according to which graphics related functions are organized in four levels. The first level refers to drawing. That level is typically supported with dedicated graphics coprocessors, like TMS240xO. The second and third levels refer to handling of graphics objects in a window environment. Level four refers to multimedia integration. Hardware/software acceleration of functions on level two and three is the subject of this research. After a simulation-based statistical analysis of selected benchmark programs, four functions on levels two and three are selected as the most critically needing an acceleration. Algorithmic improvements are introduced for these four functions, and the details of their hardware and/or software support is explained. Results of an analytical analysis are also given.<<ETX>>","PeriodicalId":103288,"journal":{"name":"Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132786276","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 1992-01-07 | DOI: 10.1109/HICSS.1992.183494
Soushan Wu, Hsien-Chang Kuo
The paper provides a general framework for setting up a DSS for bank loans from a comprehensive and practical point of view. The system takes into consideration both logical and physical factors, including the formation of the database and the model base and the procedures of system analysis, design, and programming. The framework also provides an incentive vehicle for top managers to contribute their experience and to accumulate and store valuable cases in the system.
{"title":"A framework of PC bank loan DSS","authors":"Soushan Wu, Hsien-Chang Kuo","doi":"10.1109/HICSS.1992.183494","DOIUrl":"https://doi.org/10.1109/HICSS.1992.183494","url":null,"abstract":"The paper provides a general framework regarding the setting up of a DSS for a bank loan from a comprehensive and practical point of view. This system takes into consideration the logical and physical factors, including the database and the model base formation and the procedures of system analysis, design and programming. This framework provides an incentive vehicle for the top managers to put in the experience and to accumulate or to restore the valuable cases in the systems.<<ETX>>","PeriodicalId":103288,"journal":{"name":"Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133202974","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 1992-01-07 | DOI: 10.1109/HICSS.1992.183369
M. Alexander, J. Elam, C. Wasala
The paper examines a number of theoretical perspectives that can be used to study the assimilation and impact of emerging information technologies (IT). The authors apply these theoretical perspectives to document image processing (DIP). The paper adopts an emergent perspective, constructing a conceptual framework that can be used to study both the process of implementation and the factors that affect it. The interplay of the associated environmental, technological, and behavioral factors in this implementation is also studied. DIP systems are an emerging information technology that can result in marked organizational changes, including changes in workflow, job descriptions, and organizational form. Imaging's viability is established, but the exact conditions under which it flourishes remain obscure. This study informs both the use of the technology and the theory that addresses the assimilation of emerging information technologies.
{"title":"Multiple theoretical perspectives for studying the assimilation of emerging information technologies","authors":"M. Alexander, J. Elam, C. Wasala","doi":"10.1109/HICSS.1992.183369","DOIUrl":"https://doi.org/10.1109/HICSS.1992.183369","url":null,"abstract":"The paper examines a number of theoretical perspectives that can be used to study the assimilation and impact of emerging information technologies (IT). The authors apply these theoretical perspectives to document image processing (DIP). The paper adopts an emergent perspective, constructing a conceptual framework that can be used to study both the process of implementation and the factors which affect it. The interplay of associated environmental, technological and behavioral factors in this implementation are also studied. DIP systems are an emerging information technology which can result in marked organizational changes, including changes in workflow, job descriptions and organizational form. Imaging's viability is established, but the exact conditions under which it flourishes remain obscure. This study informs both the use of the technology and the theory which addresses the assimilation of emerging information technologies.<<ETX>>","PeriodicalId":103288,"journal":{"name":"Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133431321","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 1992-01-07 | DOI: 10.1109/HICSS.1992.183144
Robert Y. Hou, G. Ganger, Y. Patt, C. Gimarc
For many years, processor cycle times have continued to decrease at a very rapid rate. On top of this, advances in multiprocessor technology have allowed potential system performance to increase at an even faster rate. The result is that the performance of many of today's computer systems is limited by the I/O subsystem. In this paper, the authors attempt to do two things: (1) separate the I/O space into three categories, based on their very different raisons d'être and consequently very different characteristics, and (2) focus on the issues pertaining to improving the performance of one basic mechanism in the I/O subsystem, the magnetic disk. They do the former in order to set the framework of the I/O space. (If one is to improve the performance of the I/O subsystem, one first has to understand the nature of I/O and not cloud one's efforts by treating I/O as one homogeneous structure.) They do the latter as the first step in dealing with the various mechanisms that make up the I/O space.
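To illustrate why the magnetic disk is the natural first target, the short C sketch below works out a rough per-request service time as average seek plus half a rotation plus transfer time. The parameter values (12 ms seek, 3600 RPM, 2 MB/s media rate, 4 KB requests) are illustrative early-1990s-style assumptions, not figures taken from the paper; the point is simply that the mechanical terms dominate and cap each disk at a few tens of requests per second.

#include <stdio.h>

/* Rough per-request service time for a magnetic disk:
   average seek + half a rotation (average rotational delay) + transfer time.
   All parameters are illustrative assumptions, not data from the paper. */
static double disk_service_time_ms(double avg_seek_ms, double rpm,
                                   double transfer_mb_per_s,
                                   double request_kb)
{
    double half_rotation_ms = 0.5 * (60.0 * 1000.0 / rpm);
    double transfer_ms = (request_kb / 1024.0) / transfer_mb_per_s * 1000.0;
    return avg_seek_ms + half_rotation_ms + transfer_ms;
}

int main(void)
{
    /* e.g. 12 ms average seek, 3600 RPM, 2 MB/s media rate, 4 KB request */
    double t = disk_service_time_ms(12.0, 3600.0, 2.0, 4.0);
    printf("approx. service time: %.1f ms (~%.0f requests/s per disk)\n",
           t, 1000.0 / t);
    return 0;
}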
{"title":"Issues and problems in the I/O subsystem. I. The magnetic disk","authors":"Robert Y. Hou, G. Ganger, Y. Patt, C. Gimarc","doi":"10.1109/HICSS.1992.183144","DOIUrl":"https://doi.org/10.1109/HICSS.1992.183144","url":null,"abstract":"For many years, processor cycle times have continued to increase at a very rapid rate. On top of this, advances in multiprocessor technology have allowed potential system performance to increase at an even faster rate. The result is that the performance of many of today's computer systems is limited by the I/O subsystem. In this paper, the authors attempt to do two things: (1) separate I/O space into three categories, based on their very different raisons d'etre and consequently very different characteristics, and (2) focus on the issues pertaining to improving the performance of one basic mechanism in the I/O subsystem, the magnetic disk. They do the former in order to set the framework of the I/O space. (If one is to improve the performance of the I/O subsystem, one, first has to understand the nature of I/O and not cloud one's efforts by treating I/O as one homogeneous structure.) They do the latter as the first step in dealing with the various mechanisms that make up the I/O space.<<ETX>>","PeriodicalId":103288,"journal":{"name":"Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133600936","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}