Distributed Graphplan
Pub Date: 2002-11-04 | DOI: 10.1109/TAI.2002.1180798
M. Iwen, A. Mali
Significant advances in plan synthesis under classical assumptions have occurred in the last seven years. These efficient planners are all centralized. A major development among them is the Graphplan planner, whose popularity is evident from its many efficient adaptations and extensions. Since many practical planning problems are solved in a distributed manner, it is important to adapt Graphplan to distributed planning. This involves significant challenges, such as decomposing the goal and the set of actions without losing completeness. We report two sound two-agent planners: DGP (distributed Graphplan) and IG-DGP (interaction graph-based DGP). In DGP, decomposition of the goal and action set is carried out manually; in IG-DGP it is carried out automatically, based on a new representation called interaction graphs. Our empirical evaluation shows that both distributed planners are faster than Graphplan, and IG-DGP is orders of magnitude faster. IG-DGP benefits significantly from interaction graphs, which under certain conditions allow a problem to be decomposed into fully independent subproblems. IG-DGP is a hybrid planner in which a centralized planner processes a problem until it becomes separable into two independent subproblems, which are then passed to a distributed planner. This paper also shows that advances in centralized planning can significantly benefit distributed planners.
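The abstract does not spell out how interaction graphs drive the decomposition. As an illustrative sketch only (the goal names and the interference relation below are invented, not from the paper), independent subproblems fall out as the connected components of a graph whose nodes are goals and whose edges connect goals that may interfere:

```python
from collections import defaultdict

def independent_subproblems(goals, interacts):
    """Partition goals into independent subproblems by finding the
    connected components of an undirected interaction graph.
    `interacts` is a set of frozenset({g1, g2}) pairs marking goals
    whose supporting actions may interfere."""
    adj = defaultdict(set)
    for pair in interacts:
        a, b = tuple(pair)
        adj[a].add(b)
        adj[b].add(a)
    seen, components = set(), []
    for g in goals:
        if g in seen:
            continue
        stack, comp = [g], set()
        while stack:  # depth-first walk of one component
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            comp.add(node)
            stack.extend(adj[node] - seen)
        components.append(comp)
    return components

# Two goal clusters that never interact can be planned for separately.
goals = ["on(A,B)", "on(B,C)", "at(truck,depot)"]
interacts = {frozenset({"on(A,B)", "on(B,C)"})}
print(independent_subproblems(goals, interacts))
```

Each returned component can then be handed to a separate agent, which is the precondition IG-DGP exploits.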
{"title":"Distributed Graphplan","authors":"M. Iwen, A. Mali","doi":"10.1109/TAI.2002.1180798","DOIUrl":"https://doi.org/10.1109/TAI.2002.1180798","url":null,"abstract":"Significant advances in plan synthesis under classical assumptions have occurred in the last seven years. Such efficient planners are all centralized planners. One very major development among these is the Graphplan planner. Its popularity is clear from its several efficient adaptations/extensions. Since several practical planning problems are solved in a distributed manner it is important to adapt Graphplan to distributed planning. This involves dealing with significant challenges like decomposing the goal and set of actions without losing completeness. We report two sound two-agent planners DGP (distributed Graphplan) and IG-DGP (interaction graph-based DGP). Decomposition of goal and action set in DGP is carried out manually and in IG-DGP it is carried out automatically based on a new representation called interaction graphs. Our empirical evaluation shows that both these distributed planners are faster than Graphplan. IG-DGP is orders of magnitude faster than Graphplan. IG-DGP benefits significantly from interaction graphs which allow decomposition of a problem into fully independent subproblems under certain conditions. IG-DGP is a hybrid planner in which a centralized planner processes a problem until it becomes separable into two independent subproblems that are passed to a distributed planner This paper also shows that advances in centralized planning can significantly benefit distributed planners.","PeriodicalId":197064,"journal":{"name":"14th IEEE International Conference on Tools with Artificial Intelligence, 2002. (ICTAI 2002). 
Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2002-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128184834","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Active tracking and cloning of facial expressions using spatio-temporal information
Pub Date: 2002-11-04 | DOI: 10.1109/TAI.2002.1180824
L. Yin, A. Basu, Matt T. Yourst
This paper presents a new method for analyzing and synthesizing facial expressions, in which a spatio-temporal gradient-based method (optical flow) is exploited to estimate the movement of facial feature points. We propose a method, called motion correlation, that improves on the conventional block correlation method for obtaining motion vectors, and we address the tracking of facial expressions under an active camera. With the motion vectors estimated, a facial expression can be cloned by adjusting an existing 3D facial model, or synthesized using different facial models. The experimental results demonstrate that the proposed approach is feasible for applications such as low bit rate video coding and face animation.
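For context on the block correlation baseline that motion correlation improves upon, here is a minimal exhaustive block-matching sketch; the frame contents, block size, and search range are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def block_motion_vector(prev, curr, top, left, bsize=8, search=4):
    """Estimate one block's motion vector by exhaustive block matching:
    slide the current block over a small search window in the previous
    frame and keep the offset with the lowest sum of absolute
    differences (SAD)."""
    block = curr[top:top + bsize, left:left + bsize].astype(np.int32)
    best, best_off = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            # Skip candidate positions that fall outside the frame.
            if y < 0 or x < 0 or y + bsize > prev.shape[0] or x + bsize > prev.shape[1]:
                continue
            cand = prev[y:y + bsize, x:x + bsize].astype(np.int32)
            sad = np.abs(block - cand).sum()
            if best is None or sad < best:
                best, best_off = sad, (dy, dx)
    return best_off

# A bright feature shifted by (+2, +3) pixels between frames.
prev = np.zeros((32, 32), dtype=np.uint8)
prev[10:18, 10:18] = 200
curr = np.zeros((32, 32), dtype=np.uint8)
curr[12:20, 13:21] = 200
print(block_motion_vector(prev, curr, top=12, left=13))  # (-2, -3)
```

The offset points back to where the block came from in the previous frame, which is why the recovered vector is the negative of the feature's displacement.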
{"title":"Active tracking and cloning of facial expressions using spatio-temporal information","authors":"L. Yin, A. Basu, Matt T. Yourst","doi":"10.1109/TAI.2002.1180824","DOIUrl":"https://doi.org/10.1109/TAI.2002.1180824","url":null,"abstract":"This paper presents a new method to analyze and synthesize facial expressions, in which a spatio-temporal gradient based method (i.e., optical flow) is exploited to estimate the movement of facial feature points. We proposed a method (called motion correlation) to improve the conventional block correlation method for obtaining motion vectors. The tracking of facial expressions under an active camera is addressed. With the motion vectors estimated, a facial expression can be cloned by adjusting the existing 3D facial model, or synthesized using different facial models. The experimental results demonstrate that the approach proposed is feasible for applications such as low bit rate video coding and face animation.","PeriodicalId":197064,"journal":{"name":"14th IEEE International Conference on Tools with Artificial Intelligence, 2002. (ICTAI 2002). Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2002-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114692270","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A tool for belief updating over time in Bayesian networks
Pub Date: 2002-11-04 | DOI: 10.1109/TAI.2002.1180816
Jie Yang, C. Mohan, K. Mehrotra, P. Varshney
We have developed a tool that facilitates dynamically updating beliefs over time. The tool addresses directed probabilistic inference networks that may contain cycles, and takes into account the time delays associated with observations and decisions. The relevance of different observers may decay at different rates within the same application, and the belief in a hypothesis decays toward its associated prior probability. Simple models with few parameters have been implemented, with a user interface that facilitates changes to the structure and parameters of the graphical model and the associated conditional probabilities.
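The decay-toward-prior behavior can be sketched with a simple exponential model. The half-life parameterization below is an assumption for illustration, not the tool's actual update rule:

```python
def decayed_belief(current, prior, elapsed, half_life):
    """Exponentially decay a belief toward its prior as an observation
    ages: after one half-life, half of the evidence's pull remains."""
    w = 0.5 ** (elapsed / half_life)  # remaining relevance of the observation
    return prior + (current - prior) * w

# A 0.9 belief over a 0.5 prior loses half its excess every 10 time units.
print(decayed_belief(0.9, 0.5, elapsed=10, half_life=10))  # ~0.7
```

Giving each observer its own half-life captures the abstract's point that relevance can decay at different rates within the same application.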
{"title":"A tool for belief updating over time in Bayesian networks","authors":"Jie Yang, C. Mohan, K. Mehrotra, P. Varshney","doi":"10.1109/TAI.2002.1180816","DOIUrl":"https://doi.org/10.1109/TAI.2002.1180816","url":null,"abstract":"We have developed a tool that facilitates dynamically updating beliefs with time. This tool addresses directed probabilistic inference networks that may contain cycles, and takes into account the time delays associated with observations and decisions. Relevance of different observers may decay at different rates in the same application, and the belief in a hypothesis decays towards the associated prior probability. Simple models with few parameters have been implemented, with a user interface that facilitates changes to the structure and parameters of the graphical model, and associated conditional probabilities.","PeriodicalId":197064,"journal":{"name":"14th IEEE International Conference on Tools with Artificial Intelligence, 2002. (ICTAI 2002). Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2002-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122020694","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Hiding a logo watermark into the multiwavelet domain using neural networks
Pub Date: 2002-11-04 | DOI: 10.1109/TAI.2002.1180841
Jun Zhang, Nengchao Wang, Feng Xiong
This paper proposes a novel image watermarking scheme in which a logo watermark is embedded into the multiwavelet domain of the image using neural networks. Like the scalar wavelet transform, the multiwavelet domain provides a multiresolution representation of the image. However, the coarsest level of the multiwavelet domain contains four subblocks, where the scalar wavelet domain has only one, and there is great similarity among these subblocks. Exploiting these characteristics, we embed each bit of the watermark by adjusting the polarity between a coefficient in one subblock and the mean value of the corresponding coefficients in the other three subblocks. Furthermore, we use a back-propagation neural network (BPN) to learn the characteristics of the relationship between the watermark and the watermarked image. Owing to the learning and adaptive capabilities of the BPN, false recovery of the watermark can be greatly reduced by the trained network. Experimental results show that the proposed method has good imperceptibility and high robustness to common image processing operations.
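The polarity-adjustment idea can be sketched as follows. The `strength` margin and the exact adjustment rule are assumptions for illustration, and the paper's BPN refinement stage is omitted:

```python
def embed_bit(coeff, others_mean, bit, strength=2.0):
    """Embed one watermark bit by forcing the polarity of a coefficient
    relative to the mean of its corresponding coefficients in the other
    subblocks: above the mean encodes 1, below the mean encodes 0."""
    if bit:
        return max(coeff, others_mean + strength)
    return min(coeff, others_mean - strength)

def extract_bit(coeff, others_mean):
    """Recover the bit from the stored polarity."""
    return 1 if coeff > others_mean else 0

# Embed a 1 into a coefficient that starts below the reference mean.
c = embed_bit(5.0, others_mean=6.0, bit=1)
print(extract_bit(c, 6.0))  # 1
```

Because the four coarsest-level subblocks are highly similar, the mean of the other three is a stable reference, so the polarity tends to survive mild image processing.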
{"title":"Hiding a logo watermark into the multiwavelet domain using neural networks","authors":"Jun Zhang, Nengchao Wang, Feng Xiong","doi":"10.1109/TAI.2002.1180841","DOIUrl":"https://doi.org/10.1109/TAI.2002.1180841","url":null,"abstract":"This paper proposes a novel watermarking scheme for an image, in which a logo watermark is embedded into the multiwavelet domain of the image using neural networks. The multiwavelet domain provides us with a multiresolution representation of the image like the scalar wavelet case. However, there are four subblocks in the coarsest level of the multiwavelet domain, where there is only one in that of the scalar wavelet domain, and also there is a great similarity among these subblocks. According to these characteristics of the multiwavelet domain, we embed a bit of the watermark by adjusting the polarity between the coefficient in one subblock and the mean value of the corresponding coefficients in other three subblocks. Furthermore, we use a back-propagation neural network (BPN) to learn the characteristics of relationship between the watermark and the watermarked image. Due to the learning and adaptive capabilities of the BPN, the false recovery of the watermark can be greatly reduced by the trained BPN. Experimental results show that the proposed method has good imperceptibility and high robustness to common image processing operators.","PeriodicalId":197064,"journal":{"name":"14th IEEE International Conference on Tools with Artificial Intelligence, 2002. (ICTAI 2002). 
Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2002-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117013282","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A genetic testing framework for digital integrated circuits
Pub Date: 2002-11-04 | DOI: 10.1109/TAI.2002.1180847
X. Yu, A. Fin, F. Fummi, E. Rudnick
To reduce time-to-market and simplify gate-level test generation for digital integrated circuits, GA-based functional test generation techniques are proposed for behavioral and register-transfer-level designs. The functional tests generated can be used for design verification, and they can also be reused at lower levels (i.e., register transfer and logic gate levels) for testability analysis and development. Experimental results demonstrate the effectiveness of the method in reducing overall test generation time and increasing gate-level fault coverage.
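A minimal GA of the kind described might look as follows; the population size, operators, and the toy coverage function are illustrative choices, not the paper's actual framework:

```python
import random

def evolve_tests(fitness, n_bits=8, pop_size=20, generations=40, seed=1):
    """Toy genetic algorithm for functional test generation: evolve bit
    vectors (test stimuli) toward higher fitness (e.g. statements or
    faults covered) via tournament selection, one-point crossover, and
    bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Binary tournament: the fitter of two random individuals.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.1:                # occasional bit flip
                i = rng.randrange(n_bits)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Hypothetical coverage metric: more 1-bits exercise more logic.
best = evolve_tests(lambda v: sum(v))
print(sum(best))
```

In a real flow, `fitness` would run the behavioral or RTL model on the stimulus and count covered statements or detected faults, which is where most of the runtime goes.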
{"title":"A genetic testing framework for digital integrated circuits","authors":"X. Yu, A. Fin, F. Fummi, E. Rudnick","doi":"10.1109/TAI.2002.1180847","DOIUrl":"https://doi.org/10.1109/TAI.2002.1180847","url":null,"abstract":"In order to reduce the time-to-market and simplify gate-level test generation for digital integrated circuits, GA-based functional test generation techniques are proposed for behavioral and register transfer level designs. The functional tests generated can be used for design verification, and they can also be reused at lower levels (i.e. register transfer and logic gate levels) for testability analysis and development. Experimental results demonstrate the effectiveness of the method in reducing the overall test generation time and increasing the gate-level fault coverage.","PeriodicalId":197064,"journal":{"name":"14th IEEE International Conference on Tools with Artificial Intelligence, 2002. (ICTAI 2002). Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2002-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117316361","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Mining characteristic rules for understanding simulation data
Pub Date: 2002-11-04 | DOI: 10.1109/TAI.2002.1180828
Jianping Zhang, J. Bala, P. Barry, T. Meyer, S. Johnson
The Marine Corps' Project Albert seeks to model complex phenomena by observing the behavior of relatively simple simulations over thousands of runs. These simulations are based upon lightweight agents whose essential behavior has been distilled down to a small number of rules. By varying the parameters of these rules, Project Albert simulations can explore emergent complex nonlinear behaviors, with the aim of developing insight not readily provided by first-principles mathematical models. Thousands of runs of Albert simulation models create large amounts of data describing the associations and correlations between the simulation input and output parameters, and understanding these associations is critical to understanding the simulated phenomena. This paper presents a data mining approach to analyzing the large-scale, highly uncertain Albert simulation data. Specifically, a characteristic rule discovery algorithm is described, together with its application to Albert simulation runtime data.
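A naive sketch of characteristic-rule discovery follows; the attributes, data, and support threshold are invented for illustration, and the paper's algorithm is certainly more sophisticated:

```python
def characteristic_conditions(rows, target, min_support=0.8):
    """Naive characteristic-rule discovery: find attribute values shared
    by at least `min_support` of the rows in the target class. Each
    surviving (attribute, value, support) triple characterizes the class."""
    members = [r for r in rows if r["class"] == target]
    conds = []
    for attr in members[0]:
        if attr == "class":
            continue
        for val in {r[attr] for r in members}:
            support = sum(r[attr] == val for r in members) / len(members)
            if support >= min_support:
                conds.append((attr, val, support))
    return conds

# Hypothetical simulation runs: high aggression co-occurs with losses.
rows = [
    {"aggression": "high", "terrain": "open", "class": "loss"},
    {"aggression": "high", "terrain": "urban", "class": "loss"},
    {"aggression": "high", "terrain": "open", "class": "loss"},
    {"aggression": "low", "terrain": "open", "class": "win"},
]
print(characteristic_conditions(rows, "loss"))
```

Unlike discriminant rules, characteristic rules describe what is typical of a class without claiming the condition is unique to it, which suits exploratory analysis of noisy simulation output.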
{"title":"Mining characteristic rules for understanding simulation data","authors":"Jianping Zhang, J. Bala, P. Barry, T. Meyer, S. Johnson","doi":"10.1109/TAI.2002.1180828","DOIUrl":"https://doi.org/10.1109/TAI.2002.1180828","url":null,"abstract":"The Marine Corps' Project Albert seeks to model complex phenomenon by observing the behavior of relatively simple simulations over thousands of runs. These simulations are based upon lightweight agents, whose essential behavior has been distilled down to a small number of rules. By varying the parameters of these rules, Project Albert simulations can explore emergent complex nonlinear behaviors with the aim of developing insight not readily provided by first principle mathematical models. Thousands of runs of Albert simulation models create large amount of data that describe the association/correlation between the simulation input and output parameters. Understanding the associations between the simulation input and output parameters is critical to understanding the simulated complex phenomenon. This paper presents a data mining approach to analyzing the large scale and highly uncertain Albert simulation data. Specifically, a characteristic rule discovery algorithm is described in the paper together with its application to the Albert simulation runtime data.","PeriodicalId":197064,"journal":{"name":"14th IEEE International Conference on Tools with Artificial Intelligence, 2002. (ICTAI 2002). Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2002-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124983045","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Software measurement data analysis using memory-based reasoning
Pub Date: 2002-11-04 | DOI: 10.1109/TAI.2002.1180813
R. Paul, F. Bastani, Venkata U. B. Challagulla, I. Yen
The goal of accurate software measurement data analysis is to improve understanding of the software development process, together with increased product quality and reliability. Several techniques have been proposed to enhance the reliability prediction of software systems using stored measurement data, but no single method has proved completely effective. One of the critical parameters for software prediction systems is the size of the measurement data set, with larger data sets providing better reliability estimates. In this paper, we propose a software defect classification method that allows defect data from multiple projects and multiple independent vendors to be combined to obtain large data sets. We also show that, once a sufficient amount of information has been collected, the memory-based reasoning technique can be applied to projects that are not in the analysis set to predict their reliability and guide their testing process. Finally, the result of applying this approach to the analysis of defect data generated from fault-injection simulation is presented.
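Memory-based reasoning is, at its core, nearest-neighbor prediction over stored measurement vectors. A minimal sketch, with invented metrics and labels (the paper's actual features and distance function are not given here):

```python
def knn_predict(train, query, k=3):
    """Memory-based reasoning in its simplest form: predict a project's
    defect label from the majority vote of its k nearest neighbors in
    the measurement data set (Euclidean distance on metric vectors)."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda row: dist(row[0], query))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

# Hypothetical measurements: (size in KLOC, complexity) -> defect-prone?
train = [((10, 3), 0), ((12, 4), 0), ((60, 22), 1), ((55, 19), 1), ((11, 2), 0)]
print(knn_predict(train, (58, 20)))  # 1
```

This is why combining defect data across projects and vendors matters: the more stored cases, the more likely a new project has genuinely similar neighbors.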
{"title":"Software measurement data analysis using memory-based reasoning","authors":"R. Paul, F. Bastani, Venkata U. B. Challagulla, I. Yen","doi":"10.1109/TAI.2002.1180813","DOIUrl":"https://doi.org/10.1109/TAI.2002.1180813","url":null,"abstract":"The goal of accurate software measurement data analysis is to increase the understanding and improvement of software development process together with increased product quality and reliability. Several techniques have been proposed to enhance the reliability prediction of software systems using the stored measurement data, but no single method has proved to be completely effective. One of the critical parameters for software prediction systems is the size of the measurement data set, with large data sets providing better reliability estimates. In this paper, we propose a software defect classification method that allows defect data from multiple projects and multiple independent vendors to be combined together to obtain large data sets. We also show that once a sufficient amount of information has been collected, the memory-based reasoning technique can be applied to projects that are not in the analysis set to predict their reliabilities and guide their testing process. Finally, the result of applying this approach to the analysis of defect data generated from fault-injection simulation is presented.","PeriodicalId":197064,"journal":{"name":"14th IEEE International Conference on Tools with Artificial Intelligence, 2002. (ICTAI 2002). 
Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2002-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121482587","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Ontologies for knowledge representation in a computer-based patient record
Pub Date: 2002-11-04 | DOI: 10.1109/TAI.2002.1180795
E. Bayegan, Ø. Nytrø, A. Grimsmo
In contrast to existing patient-record systems, which merely offer static applications for storage and presentation, a helpful patient-record system is a problem-oriented, knowledge-based system that provides clinicians with situation-dependent information. We propose a practical approach to extending the current data model with (1) means to recognize and interpret situations, (2) knowledge of how clinicians work and what information they need, and (3) means to rank information according to its relevance in a given care situation. Following the methodology of second-generation knowledge-based systems, which use ontologies to define fundamental concepts, their properties, and their interrelationships within a particular domain, we present an ontology that supports three prerequisite features for a future helpful patient-record system: a family-care workflow process, a problem-oriented patient record, and means to identify information relevant to the care process and to medical problems.
{"title":"Ontologies for knowledge representation in a computer-based patient record","authors":"E. Bayegan, Ø. Nytrø, A. Grimsmo","doi":"10.1109/TAI.2002.1180795","DOIUrl":"https://doi.org/10.1109/TAI.2002.1180795","url":null,"abstract":"In contrast to existing patient-record systems, which merely offer static applications for storage and presentation, a helpful patient-record system is a problem-oriented, knowledge-based system, which provides clinicians with situation-dependent information. We propose a practical approach to extend the current data model with (1) means to recognize and interpret situations, (2) knowledge of how clinicians work and what information they need, and (3) means to rank information according to its relevance in a given care situation. Following the methodology of second-generation knowledge-based systems, that use ontologies to define fundamental concepts, their properties, and interrelationships within a particular domain, we present an ontology that supports three prerequisite features for a future helpful patient-record system: a family-care workflow process, a problem-oriented patient record, and means to identify relevant information to the care process and medical problems.","PeriodicalId":197064,"journal":{"name":"14th IEEE International Conference on Tools with Artificial Intelligence, 2002. (ICTAI 2002). Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2002-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128308340","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Metric estimation via a fuzzy grade-of-membership model applied to analysis of business opportunities
Pub Date: 2002-11-04 | DOI: 10.1109/TAI.2002.1180835
B. Talbot, Bruce B. Whitehead, L. Talbot
This paper describes a technique for estimating business opportunity metrics from databases of mixed numeric and categorical data by using a fuzzy grade-of-membership clustering model. The technique is applied to the problem of opportunity analysis for business decision-making. We propose two metrics, called unfamiliarity and follow-on importance. Real business contract data are used to demonstrate the technique. This general approach could be adapted to many other applications where a decision agent needs to assess the value of items from a set of opportunities with respect to a reference set representing its business.
{"title":"Metric estimation via a fuzzy grade-of-membership model applied to analysis of business opportunities","authors":"B. Talbot, Bruce B. Whitehead, L. Talbot","doi":"10.1109/TAI.2002.1180835","DOIUrl":"https://doi.org/10.1109/TAI.2002.1180835","url":null,"abstract":"This paper describes a technique for estimating business opportunity metrics from mixed numeric and categorical type databases by using a fuzzy Grade-of-Membership clustering model. The technique is applied to the problem of opportunity analysis for business decision-making. We propose two metrics called unfamiliarity and follow-on importance. Real business contract data are used to demonstrate the technique. This general approach could be adapted to many other applications where a decision agent needs to assess the value of items from a set of opportunities with respect to a reference set representing its business.","PeriodicalId":197064,"journal":{"name":"14th IEEE International Conference on Tools with Artificial Intelligence, 2002. (ICTAI 2002). Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2002-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125554724","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Building secure survivable semantic webs
Pub Date: 2002-11-04 | DOI: 10.1109/TAI.2002.1180830
B. Thuraisingham
This paper describes some ideas for a secure, survivable semantic web, following some of our previous ideas on the dependable semantic web. The semantic web is a technology for making Web pages understandable to machines. It is important that the semantic web be secure. In addition, data exchanged on the Web has to be of high quality and must survive failures and errors, and the processes that the Web supports have to meet certain timing constraints. This paper discusses these aspects and describes how together they provide a dependable semantic web.
{"title":"Building secure survivable semantic webs","authors":"B. Thuraisingham","doi":"10.1109/TAI.2002.1180830","DOIUrl":"https://doi.org/10.1109/TAI.2002.1180830","url":null,"abstract":"This paper describes some ideas for a secure survivable semantic web that follows some of our previous ideas on dependable semantic web. Semantic web is a technology for understanding Web pages. It is important that the semantic web is secure. In addition, data exchanged by the Web has to be of high quality and survive failures and errors. The processes that the Web supports have to meet certain timing constraints. This paper discusses these aspects, and describes how they provide a dependable semantic web.","PeriodicalId":197064,"journal":{"name":"14th IEEE International Conference on Tools with Artificial Intelligence, 2002. (ICTAI 2002). Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2002-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130397601","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}