{"title":"Building embedded languages and expert system shells in Prolog","authors":"L. U. Yalçinalp, L. Sterling","doi":"10.1109/TAI.1990.130310","DOIUrl":"https://doi.org/10.1109/TAI.1990.130310","url":null,"abstract":"Building embedded languages in Prolog is considered, with special attention given to expert system shells. The paradigm of metaprogramming, of which building embedded languages is an example, is discussed. Interpreters for embedded languages are reviewed with emphasis on metainterpreters. Two applications, explanation and uncertainty reasoning, are presented, and the techniques that were used in their construction are discussed.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123499691","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
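The uncertainty-reasoning application in this abstract is typically realized as a metainterpreter that propagates certainty factors alongside proof. As a rough illustration only (not the authors' Prolog code), here is a minimal rule interpreter in Python with MYCIN-style certainty combination; the rule base and the `solve` function are hypothetical examples:

```python
# Toy certainty-factor interpreter: a rule's conclusion gets the rule's
# own certainty scaled by the weakest premise (MYCIN-style conjunction).
RULES = [
    # (head, body, certainty of the rule itself) -- made-up example rules
    ("sprinkler_on", ["grass_wet", "no_rain"], 0.8),
    ("grass_wet", [], 0.9),   # facts carry an empty body
    ("no_rain", [], 1.0),
]

def solve(goal):
    """Return the best certainty with which `goal` can be derived."""
    best = 0.0
    for head, body, cf in RULES:
        if head != goal:
            continue
        # Certainty of a conjunction = weakest conjunct; facts get 1.0.
        support = min((solve(b) for b in body), default=1.0)
        best = max(best, cf * support)
    return best

print(solve("sprinkler_on"))  # 0.8 * min(0.9, 1.0) = approx. 0.72
```

In Prolog this logic would live in an extended `solve/2` metainterpreter clause; the Python version just makes the certainty arithmetic explicit.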
{"title":"The DIBBS blackboard control architecture and its application to distributed natural language processing","authors":"John R. R. Leavitt, Eric Nyberg","doi":"10.1109/TAI.1990.130335","DOIUrl":"https://doi.org/10.1109/TAI.1990.130335","url":null,"abstract":"The authors present DIBBS, a domain-independent blackboard system that has been used for applications in distributed natural language processing (NLP). It is shown how the explicit representation of control knowledge as units on the DIBBS control blackboard allows an application to implement dynamic control strategies and reason about its own problem-solving performance. The DIBBS truth maintenance dependency network allows robust treatment of control issues in conflict resolution. All of these properties make DIBBS especially appropriate for applications in distributed NLP, such as the DIOGENES and DIANA systems.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"105 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121772679","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Configuration management in the HILDA system","authors":"L. Hsu","doi":"10.1109/TAI.1990.130326","DOIUrl":"https://doi.org/10.1109/TAI.1990.130326","url":null,"abstract":"HILDA (high level design assistance) is a prototype of a rule-based configurer that operates in a CAD (computer-aided design) framework. The configurer in HILDA adopts the approach of generalized templates for configuring design models, i.e. it typically requires only one generalized template for each CAD tool instead of one template per design model, as in most other systems. A generalized template relieves the designers of creating and maintaining templates and yet facilitates integration of any arbitrary number of CAD tools. In particular, it supports dynamic changes of design structures in the design process by providing features such as keywords for traversing design hierarchies at run time, multiple rules for representing alternative configuring paths, backtracking for performing depth-first search and a flexible strategy for selecting versions.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129744207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Issues on deterministic transformation of logic-based program specification","authors":"J. M. Lin","doi":"10.1109/TAI.1990.130406","DOIUrl":"https://doi.org/10.1109/TAI.1990.130406","url":null,"abstract":"Issues of logic-based program transformation are discussed, and a method for transforming a source program expressed as a set of extended Horn clauses into a target program in an Algol-like procedural language is presented. The potential applications of this transformation method include (1) automatic synthesis of programs from design specifications which are either written in or translatable into extended Horn logic clauses, (2) adaptation of existing logic programs to a procedural execution environment in order to improve execution efficiency or facilitate reusability of the software, and (3) support of a hybrid-programming environment.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"298 3","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120868091","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Categorization in supervised neural network learning: A computational approach","authors":"G. Krishnan, E. B. Reynolds","doi":"10.1109/TAI.1990.130340","DOIUrl":"https://doi.org/10.1109/TAI.1990.130340","url":null,"abstract":"The authors describe a learning strategy motivated by computational constraints that enhances the speed of neural network learning. Decision regions in feature space are of three types: (1) well separated clusters (Type A), (2) disconnected clusters (Type B), and (3) clusters separated by complex boundaries (Type C). These decision regions have psychological validity, as is evident from E. Rosch's (1976) categorization theory. Rosch suggests that in taxonomies of real objects, there is one level of abstraction at which basic category cuts are made. Basic categories are similar to Type A clusters. Categories one level more abstract than basic categories are superordinate categories and categories one level less abstract are subordinate categories. These correspond to Type B and Type C clusters, respectively. It is proved that, in a binary valued feature space, basic categories can be learned by a perceptron. A two-layer network for classifying basic categories in a multi-valued feature space is described. This network is used as a basis to construct neural network STRUCT for learning superordinate and subordinate categories.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122327350","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
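The abstract's claim that basic categories (well-separated clusters) in a binary feature space are perceptron-learnable can be demonstrated with the classic perceptron rule. The toy data below is a made-up stand-in for such categories, not the authors' material:

```python
# Classic perceptron learning rule on binary feature vectors.
def train_perceptron(samples, epochs=20, lr=1.0):
    """Train weights and bias on (vector, 0/1 label) pairs."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - pred                      # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Linearly separable toy "category": class 1 iff the first feature is set.
data = [((1, 0), 1), ((1, 1), 1), ((0, 1), 0), ((0, 0), 0)]
w, b = train_perceptron(data)
print([predict(w, b, x) for x, _ in data])  # [1, 1, 0, 0]
```

By the perceptron convergence theorem this always terminates with a separating hyperplane when the two clusters are linearly separable, which is exactly the Type A case.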
{"title":"HSAS: a heuristic development tool","authors":"P. Nelson, J. Dillenburg, Lana Dubinsky","doi":"10.1109/TAI.1990.130384","DOIUrl":"https://doi.org/10.1109/TAI.1990.130384","url":null,"abstract":"The authors present HSAS (heuristic search animation system), a tool which uses algorithm animation to aid the development of heuristics. HSAS has been used to analyze A* and study another search algorithm which uses islands or subgoals. It has been demonstrated that algorithm animation can be a useful tool when developing heuristics. The tool presented, HSAS, is a stepping stone into the arena of full-blown algorithm animation systems. When completed, HSAS will provide a complete environment for creating, debugging, and comparing heuristics.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132674617","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
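HSAS is used to analyze A*; for reference, here is a bare A* on a 4-connected grid with a Manhattan-distance heuristic. The maze is an invented example, and this sketch omits the animation hooks a tool like HSAS would add:

```python
import heapq

def astar(grid, start, goal):
    """Return the length of a shortest path in `grid` (0 = free, 1 = wall),
    or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # admissible
    open_heap = [(h(start), 0, start)]        # entries are (f, g, node)
    best_g = {start: 0}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            return g
        if g > best_g.get(node, float("inf")):
            continue                          # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

maze = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(maze, (0, 0), (2, 0)))  # 6
```

An animation system would instrument the pop and push sites above to visualize how the heuristic shapes the explored frontier.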
{"title":"A specialized genetic algorithm for numerical optimization problems","authors":"C. Janikow, Z. Michalewicz","doi":"10.1109/TAI.1990.130441","DOIUrl":"https://doi.org/10.1109/TAI.1990.130441","url":null,"abstract":"A specialized genetic algorithm for numerical optimization problems is described and the application of such algorithms to discrete-time optimal control problems is discussed. Numerical results obtained are compared with those obtained from a classical genetic algorithm and a system for construction and solution of large and complex mathematical programming models, GAMS. As this specialized algorithm is only a part of a proposed unified generic package, further extensions are also outlined.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133563685","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
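The specialization described is a genetic algorithm operating directly on floating-point vectors rather than bit strings. The following is a generic real-coded GA sketch in that spirit; the operators and constants are common textbook choices, not the authors' design:

```python
import random

random.seed(0)  # deterministic toy run

def fitness(x):
    """Objective to minimize: f(x) = sum of squares, optimum at 0."""
    return sum(v * v for v in x)

def evolve(pop_size=30, dims=2, gens=60, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dims)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]        # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            # Arithmetic crossover on real genes, then a small Gaussian
            # mutation, clipped back into the feasible box.
            w = random.random()
            child = [w * ai + (1 - w) * bi for ai, bi in zip(a, b)]
            child = [min(hi, max(lo, v + random.gauss(0, 0.1))) for v in child]
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(fitness(best))  # close to 0
```

The real-valued representation is what makes operators like arithmetic crossover natural, and it is a key reason such specialized GAs handle numerical and constrained problems better than binary-coded ones.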
{"title":"Knowledge based text character recognition using Fourier transform","authors":"N. Bourbakis, A. T. Gumahad","doi":"10.1109/TAI.1990.130401","DOIUrl":"https://doi.org/10.1109/TAI.1990.130401","url":null,"abstract":"The Fourier transformation was applied to a set of typed text characters, extracting their unique features and developing an appropriate knowledge base for quick text character recognition. The use of this technique may also allow the development of an adaptive recognizer capable of learning through proper development of the classifier. The proposed technique computes the Fourier transform of the input string derived by the HVP (horizontal-vertical projection) process. In particular, the string created by the HVP scheme is a combination of two strings from the horizontal and vertical projections. The coefficients of the input string-derived Fourier series are compared with the features of the known characters, and classification is performed based on the closeness of the feature set. Analysis of test results showed that the Fourier transform approach for feature extraction and the simple classification technique chosen in this project displayed a classification accuracy of over 80% for a limited set of conditions.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132101497","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
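The described pipeline (project the glyph horizontally and vertically, concatenate the two profiles, take Fourier coefficients, classify by closeness) can be sketched as follows. The 3x3 "characters" and all function names are toy illustrations, not the paper's data or code:

```python
import cmath

def dft_mag(seq):
    """Magnitudes of the discrete Fourier transform of a real sequence."""
    n = len(seq)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, x in enumerate(seq))) for k in range(n)]

def hvp_features(glyph):
    """HVP string = horizontal projection followed by vertical projection."""
    rows = [sum(r) for r in glyph]
    cols = [sum(r[c] for r in glyph) for c in range(len(glyph[0]))]
    return dft_mag(rows + cols)

def classify(glyph, known):
    """Pick the known character whose feature vector is closest."""
    feats = hvp_features(glyph)
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(known, key=lambda label: dist(feats, known[label]))

# Two toy glyphs: a vertical bar and an L shape.
bar = [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
ell = [[1, 0, 0], [1, 0, 0], [1, 1, 1]]
known = {"bar": hvp_features(bar), "ell": hvp_features(ell)}
print(classify(ell, known))  # ell
```

Using DFT magnitudes makes the features insensitive to cyclic shifts of the projection string, which is a mixed blessing: it buys some position tolerance but can conflate glyphs whose concatenated profiles are shifts of one another.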
{"title":"Performing deduction from implicit beliefs","authors":"J. V. Baalen, Robert A. Nado","doi":"10.1109/TAI.1990.130448","DOIUrl":"https://doi.org/10.1109/TAI.1990.130448","url":null,"abstract":"An implemented knowledge-based system tool called JOSIE is presented that allows specialized representations to be efficiently integrated into a uniform framework. As with previous systems of this type, JOSIE allows different kinds of knowledge to be represented differently. Specialized representations are used to improve an application system's efficiency. One way they gain efficiency is by representing some of their beliefs implicitly. The authors have identified two problems with integrating such specialized representations with a forward-directed rule system. The rule invocation problem is the problem of ensuring that rules are triggered from implicit beliefs. The implicit dependency problem is the problem of integrating dependency maintenance performed by a specialized representation with a general-purpose truth maintenance system. A solution to these problems is presented for a broad class of specialized representations.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127863260","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"KB/RMS: an intelligent assistant for requirement definition","authors":"R. Binder, J. Tsai","doi":"10.1109/TAI.1990.130407","DOIUrl":"https://doi.org/10.1109/TAI.1990.130407","url":null,"abstract":"A conceptual framework and a system model for an intelligent assistant for requirement definition, KB/RMS, is presented. The requirement definition process is characterised by the requirements context model. Informal and formal methods for requirement definition are considered in light of this model, which serves as the logical schema for the KB/RMS database. Conventional and knowledge-based system support for requirement definition is summarized. The use of natural language processing, a semantic model of the problem and solution spaces, domain and technology models, and inference-driven augmentation, validation, and verification of the semantic model is discussed. Production of design representations from the augmented semantic model is covered.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116687951","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}