Methods are proposed which automatically extract a high-level knowledge representation, in the form of rules, from the lower-level representation used by neural networks. The strength of neural networks in dealing with noise has made it possible to produce correct rules in a noisy domain. Results obtained when applying the proposed method to a noisy domain suggest that it can be used in real-world domains. It is believed that this work will lead to an area of machine learning that uses neural networks as the basis of knowledge acquisition capable of dealing with real-world difficulties.
{"title":"Machine learning using single-layered and multi-layered neural networks","authors":"S. Sestito, T. Dillon","doi":"10.1109/TAI.1990.130346","DOIUrl":"https://doi.org/10.1109/TAI.1990.130346","url":null,"abstract":"Methods are proposed which automatically extract a high level knowledge representation in the form of rules from the lower level representation used by neural networks. The strength of neural networks in dealing with noise has made it possible to produce correct rules in a noisy domain. Results obtained when applying the proposed method to a noisy domain suggest that this method can be used in real-world domains. It is believed that this work will lead to an area of machine learning which uses neural networks as the basis of knowledge acquisition which can deal with real-world difficulties.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130389403","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Conventional and emerging neural approaches to fault-tolerant data retrieval, where the input keyword and/or the database itself may contain noise (errors), are reviewed. Spelling checking is used as the primary example to illustrate the various approaches and to contrast conventional (algorithmic) techniques with research methods based on neural associative memories. Recent research on associative spelling checkers is summarized and some original results are presented. It is concluded that most neural models do not provide a viable solution for robust data retrieval, owing to saturation and scaling problems. However, a combination of conventional and neural approaches is shown to achieve excellent error-correction rates at low computational cost; hence, it can be a good choice for robust data retrieval in large databases.
{"title":"Conventional and associative memory-based spelling checkers","authors":"V. Cherkassky, N. Vassilas, Gregory L. Brodt","doi":"10.1109/TAI.1990.130323","DOIUrl":"https://doi.org/10.1109/TAI.1990.130323","url":null,"abstract":"Conventional and emerging neural approaches to fault-tolerant data retrieval when the input keyword and/or database itself may contain noise (errors) are reviewed. Spelling checking is used as a primary example to illustrate various approaches and to contrast the difference between conventional (algorithmic) techniques and research methods based on neural associative memories. Recent research on associative spelling checkers is summarized and some original results are presented. It is concluded that most neural models do not provide a viable solution for robust data retrieval due to saturation and scaling problems. However, a combination of conventional and neural approaches is shown to have excellent error correction rates and low computational costs; hence, it can be a good choice for robust data retrieval in large databases.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"259 ","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114004128","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A probabilistic model for conflict-based diagnostic problem solving is presented. The model provides an objective measure for ranking hypotheses. Based on the probabilistic model, an algorithm is presented that generates plausible diagnostic hypotheses in decreasing order of their probabilities. Some possible uses of the algorithm are discussed.
{"title":"Hypothesis generation in conflict based diagnosis","authors":"Jiah-Shing Chen, S. Srihari","doi":"10.1109/TAI.1990.130353","DOIUrl":"https://doi.org/10.1109/TAI.1990.130353","url":null,"abstract":"A probabilistic model for conflict-based diagnostic problem solving is presented. This model provides an objective measure for ranking hypotheses. Based on the probabilistic model, an algorithm which generates plausible diagnostic hypotheses in decreasing order of their probabilities is presented. Some possible uses of the algorithm are discussed.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"158 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132524632","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Knowledge-based systems can be adapted interactively by non-programming experts to only a limited degree. The authors present an approach that delivers an additional layer of adaptability to such users. The added functionality allows the user to construct and specify the semantics of new domain terms, where these terms may refer to domain entities, characteristics, or relations. The approach depends on strong typing and extensive indexing of domain-language terms. A system built on this approach exhibits a clear division of labor between the domain expert and a knowledge-base system programmer: the domain expert adds terms that describe particular domain elements, while the programmer provides building blocks for a new term's semantics. The proposed approach has been implemented as part of a machine learning project in a system called KRAM.
{"title":"Interactive extension of domain datatypes","authors":"H. J. Antonisse","doi":"10.1109/TAI.1990.130332","DOIUrl":"https://doi.org/10.1109/TAI.1990.130332","url":null,"abstract":"Knowledge-based systems can be adapted interactively by non-programming experts to only a limited degree. The authors present an approach that delivers an additional layer of adaptability to such users. The added functionality allows the user to construct and specify the semantics of new domain terms, where these terms may refer to domain entities, characteristics or relations. The approach depends on strong-typing and extensive indexing of domain language terms. A system built on this approach exhibits a clear division of labor between the domain expert and a knowledge-base system programmer: the domain expert adds terms that describes particular domain elements while the programmer provides building blocks for a new term's semantics. The proposed approach has been implemented as part of a machine learning project in a system called KRAM.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130034216","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A novel parallel architecture is proposed: a processor pipeline comprising processors linearly connected via dual-bank switchable memory blocks. A layered neural network with the error back-propagation algorithm is adopted as a benchmark test; the essential part of the algorithm is the multiplication of a matrix with a vector. An experimental system was implemented, and several measurements were made which demonstrate the suitability of the proposed architecture for some practical applications.
{"title":"Parallel computation of neural networks in a processor pipeline with partially shared memory","authors":"Y. Okawa, Takayuki Suyama","doi":"10.1109/TAI.1990.130347","DOIUrl":"https://doi.org/10.1109/TAI.1990.130347","url":null,"abstract":"A novel parallel architecture of a processor pipeline is proposed, comprising linearly connected processors via dual bank switchable memory blocks. A layered neural network with the back-propagating error algorithm is adopted as a benchmark test. The essential part of the algorithm is a matrix multiplication with a vector. An experimental system was implemented, and several measurements were made which demonstrate the suitability of the proposed architecture in some practical applications.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"119 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131561192","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The quality of an answer to a database query can be enhanced by utilizing semantic knowledge such as an expert would use to answer the query. A system capable of utilizing such expert knowledge in a deductive-database paradigm is presented. A given query is reformulated to reduce the cost of evaluation, to generate a subset of answers in the shortest amount of time, and/or to generate the set of good answers. Semantic knowledge is represented by a set of Horn clauses in which a number of special predicates are used. The situations in which certain knowledge becomes relevant to a query are defined, and a method of identifying these situations is given. The authors present the ways in which semantic knowledge can be used, together with a backtracking scheme for handling inaccurate knowledge.
{"title":"Knowledge-directed query processing in expert database systems","authors":"Sang-goo Lee, L. Henschen, G. .. Qadah","doi":"10.1109/TAI.1990.130413","DOIUrl":"https://doi.org/10.1109/TAI.1990.130413","url":null,"abstract":"The quality of an answer to a database query can be enhanced by utilizing semantic knowledge such that an expert would use if he/she were to answer the query. A system capable of utilizing such expert knowledge in a deductive database paradigm is presented. A given query is reformulated to reduce the cost of evaluation, to generate a subset of answers in the shortest amount of time, and/or to generate the set of good answers. Semantic knowledge is represented by a set of Horn clauses where a number of special predicates are used. The situations in which certain knowledge becomes relevant to a query are defined, and a method of identifying these situations is given. The authors present the ways in which semantic knowledge can be used and a backtracking scheme for inaccurate knowledge.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114711790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The author describes experience with using the expert-system shell G2 for a safety assessment and post-trip guidance system (SAS-II) intended for the control room of the Forsmark unit 2 nuclear power plant in Sweden. The experience is believed to be general enough to be relevant to a broader discussion of expert-system technology. In particular, real-time aspects and matters concerning data types are considered. The relationship between expert-system software technology and traditional programming technology is also examined. The SAS-II application was implemented without encountering serious problems.
{"title":"Experiences made using the expert system shell G2","authors":"S. Nilsen","doi":"10.1109/TAI.1990.130392","DOIUrl":"https://doi.org/10.1109/TAI.1990.130392","url":null,"abstract":"The author describes experience with using the expert system shell G2 for a safety assessment and post-trip guidance system (SAS-II) intended for the control room of the Forsmark unit 2 nuclear power plant in Sweden. The experience is believed to be general enough to be relevant in a general discussion about expert system technology. In particular, real-time aspects and matters concerning data types are considered. The relationship between expert system software technology and traditional programming technology is also examined. The implementation of the SAS-II application was done without encountering serious problems.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129396261","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The authors present a method of automatically constructing a model, or a set of constraints, from domain principles for solving the dynamic behavior of a mechanical system through qualitative simulation. Two issues are emphasized: (1) the extraction of a necessary and sufficient set of system constraints that provides the minimum uncertainty associated with qualitative simulation, and (2) the modification of system constraints over time by detecting and identifying 'system discontinuities'. The first is accomplished by describing a mechanical system as a collection of object and interconnection primitives, which allows selection of the final set of constraints with the minimum complexity and uncertainty, based on the A* algorithm with heuristics providing the problem-solving expertise. The second is accomplished by monitoring whether the states of any individual subsystems evolve toward system discontinuities represented by intra-/inter-subsystem critical states.
{"title":"Building models for qualitative prediction of system dynamic behavior","authors":"Sukhan Lee, Judy Chen","doi":"10.1109/TAI.1990.130352","DOIUrl":"https://doi.org/10.1109/TAI.1990.130352","url":null,"abstract":"The authors present a method of automatically constructing a model or a set of constraints from domain principles for solving the dynamic behavior of a mechanical system through qualitative simulation. The following issues are emphasized: (1) the extraction of a necessary and sufficient set of system constraints which provides the minimum uncertainty associated with qualitative simulation, and (2) the modification of system constraints through time by detecting and identifying 'system discontinuities'. The first is accomplished by describing a mechanical system by a collection of object and inter-connection primitives, which allows selection of the final set of constraints having the minimum complexity and uncertainty, based on the A* algorithm with heuristics providing the problem solving expertise. The second is accomplished by monitoring whether the states of any individual subsystems evolve to system discontinuities represented by intra/inter subsystem critical states.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"50 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127259066","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Athena model, introduced by C. Koutsougeras and C.A. Papachristou (1988), is a tree-like net whose adaptation is based on entropy optimization. The difficult problem in the optimization was handled using Fisher's linear discriminant method. To handle the multiple-class case, heuristics (multiple classes, generic classes) were used to reduce the problem at hand to the two-class case. In the present work, it is shown that the more general Fisher method of multiple discriminants is very effective in handling the multiple-class case directly. A method is also presented by which confidence values are produced for the overall classification decision.
{"title":"Extending Athena: multiple classes and confidence output values","authors":"C. Koutsougeras, George M. Georgiou, C. Papachristou","doi":"10.1109/TAI.1990.130425","DOIUrl":"https://doi.org/10.1109/TAI.1990.130425","url":null,"abstract":"The Athena model, introduced by C. Koutsougeras and C.A. Papachristou (1988), is a tree-like net whose adaptation is based on entropy optimization. The difficult problem in the optimization was handled by using Fisher's linear discriminant method. To handle the multiple class case, heuristics were used (multiple classes, generic classes) to reduce the problem at hand to the two-class case. In the present work, it is shown that the more general Fisher method of multiple discriminants is very effective in directly handling the multiple classes case. A method is also presented by which confidence values are produced for the overall classification decision.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130821894","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The authors present a paradigm for automating programming based on how humans program. The paradigm involves top-down decomposition of a given problem into smaller problems, using, where possible, programming cliches and analogies from previously solved problems, until a primitive level is reached; at the primitive level the problem is solved by making calls to a library of pre-existing subroutines that encode specific knowledge about the domain. The authors describe APU, which uses this paradigm to automate UNIX programming, and focus on the knowledge structure and the problem-solving capability without the use of analogy.
{"title":"APU: an automatic programmer for UNIX","authors":"M. Harandi, S. Bhansali","doi":"10.1109/TAI.1990.130372","DOIUrl":"https://doi.org/10.1109/TAI.1990.130372","url":null,"abstract":"The authors present a paradigm for automating programming based on how humans program. The paradigm involves top-down decomposition of a given problem into smaller problems using, if possible, programming cliches and analogies from previously solved problems till a primitive level is reached; at the primitive level the problem is solved by making calls to a library of pre-existing sub-routines that encode specific knowledge about the domain. The authors describe APU, which uses the above paradigm to automate UNIX programming, and focus on the knowledge structure and the problem-solving capability without the use of analogy.<<ETX>>","PeriodicalId":366276,"journal":{"name":"[1990] Proceedings of the 2nd International IEEE Conference on Tools for Artificial Intelligence","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127857658","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}