The paper discusses the first-hand experiences of the author in developing and teaching a graduate-level engineering course in adaptive optimization methods inspired by nature. It covers course content, textbooks and supplementary written material, software and computer projects, and grading and evaluation. The course has encouraged many students to pursue research in evolutionary computation, tabu search, or simulated annealing; however, it is continually being modified to reflect the many changes occurring in the field.
{"title":"Experiences with teaching adaptive optimization to engineering graduate students","authors":"Alice E. Smith","doi":"10.1109/CEC.1999.785478","DOIUrl":"https://doi.org/10.1109/CEC.1999.785478","url":null,"abstract":"The paper discusses the first-hand experiences of the author in developing and teaching a graduate level engineering course in adaptive optimization methods inspired by nature. The paper discusses course content, textbooks and supplementary written material, software and computer projects, and grading and evaluation. This course has encouraged many students to pursue research in evolutionary computation, tabu search or simulated annealing, however it is continually being modified to reflect the many changes occurring in the field.","PeriodicalId":292523,"journal":{"name":"Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117015709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Complex natural processes may often be expressed in terms of networks of computational components, such as Boolean logic gates or artificial neurons. The interaction of biological molecules and the flow of information controlling the development and behaviour of organisms is particularly amenable to this approach, and these models are well established in the biological community. However, only relatively recently have papers appeared proposing the use of such systems to perform useful, human-defined tasks. Rather than merely using the network analogy as a convenient technique for clarifying our understanding of complex systems, it may now be possible to harness the power of such systems for the purposes of computation. We review several such proposals, focusing on the molecular implementation of fundamental computational elements.
{"title":"Molecular implementation of computational components","authors":"Gerald G. Owenson, M. Amos, D. Hodgson, A. Gibbons","doi":"10.1109/CEC.1999.782527","DOIUrl":"https://doi.org/10.1109/CEC.1999.782527","url":null,"abstract":"Complex natural processes may often be expressed in terms of networks of computational components, such as Boolean logic gates or artificial neurons. The interaction of biological molecules and the flow of information controlling the development and behaviour of organisms is particularly amenable to this approach, and these models are well established in the biological community. However, only relatively recently have papers appeared proposing the use of such systems to perform useful, human-defined tasks. Rather than merely using the network analogy as a convenient technique for clarifying our understanding of complex systems, it may now be possible to harness the power of such systems for the purposes of computation. We review several such proposals, focusing on the molecular implementation of fundamental computational elements.","PeriodicalId":292523,"journal":{"name":"Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406)","volume":"320 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116819006","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Fine-honing the crossover operator to produce higher fitness children is shown to result in improved genetic search. To illustrate this, two new general-purpose crossover operators are described. These operators require more computation time than traditional crossover operators, but the number of fitness evaluations and the overall amount of time spent by the genetic algorithm (to obtain solutions of desired near-optimal quality) is reduced significantly.
{"title":"Crossover operators that improve offspring fitness","authors":"C. Mohan","doi":"10.1109/CEC.1999.782667","DOIUrl":"https://doi.org/10.1109/CEC.1999.782667","url":null,"abstract":"Fine-honing the crossover operator to produce higher fitness children is shown to result in improved genetic search. To illustrate this, two new general-purpose crossover operators are described. These operators require more computation time than traditional crossover operators, but the number of fitness evaluations and the overall amount of time spent by the genetic algorithm (to obtain solutions of desired near-optimal quality) is reduced significantly.","PeriodicalId":292523,"journal":{"name":"Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124468925","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Recent literature on genetic algorithms provides a controversial discussion of the efficiency of this particular class of randomized optimization procedures; despite several encouraging empirical results, recent theoretical analyses have argued that in most cases the runtime of genetic algorithms is inflated by at least a factor of ln(n), with n denoting the number of parameters to be optimized. It has been argued that these inefficiencies are due to intrinsic resampling effects. As a result of these theoretical considerations, a deterministic genetic algorithm has been suggested as a theoretical concept. Since it was proposed, informal discussions have arisen concerning some implementation details as well as efficacy issues. Because some implementation details are a bit tricky, this paper discusses them in a pseudo programming language similar to Pascal or C. In addition, this paper presents two possible variants in detail and compares their runtime behavior with another fairly established procedure, the breeder genetic algorithm. It turns out that on widely used test functions, the deterministic variants scale strictly better. Furthermore, this paper discusses some specific fitness functions on which randomized algorithms yield better worst-case expectations than deterministic algorithms; both types, however, require constant time on average, i.e., one function evaluation.
{"title":"The deterministic genetic algorithm: implementation details and some results","authors":"R. Salomon","doi":"10.1109/CEC.1999.782001","DOIUrl":"https://doi.org/10.1109/CEC.1999.782001","url":null,"abstract":"Recent literature on genetic algorithms provides a controversial discussion on the efficiency of this particular class of randomized optimization procedures; despite several encouraging empirical results, recent theoretical analyses have argued that in most cases, the runtime behavior of genetic algorithms is increased by at least a factor of ln(n) with n denoting the number of parameters to be optimized. It has been argued that these inefficiencies are due to intrinsic resampling effects. As a result of these theoretical considerations, a deterministic genetic algorithm has been suggested as a theoretical concept. Since its proposition, informal discussions have been raised concerning some implementation details as well as efficacy issues. Since some implementation details are a bit tricky, this paper discusses some of them in a pseudo programming language similar to Pascal or C. In addition, this paper presents two possible variants in detail and compares their runtime behavior with another fairly established procedure, the breeder genetic algorithm. It turns out that on widely-used test functions, the deterministic variants scale strictly better. Furthermore, this paper discusses some specific fitness functions on which random algorithms yield better worst-ease expectations than deterministic algorithms; but both types require constant time on average, i.e., one function evaluation.","PeriodicalId":292523,"journal":{"name":"Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128293539","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Parameter optimization has been a prime target for evolutionary algorithms for a number of years. Genetic algorithms, evolution strategies, and evolutionary programming have dealt with a variety of nonlinear programming problems. There is growing evidence that evolutionary algorithms are well suited to optimizing difficult, real-valued, multimodal functions of many variables. Despite this success story, there are still many open, interesting questions. One of them concerns the relationship between recombination operators and the landscape of the problem; it seems that different problems "require" different operators. We propose a new multi-parent crossover operator, the parabolic crossover, which works very well for certain types of landscapes. Additionally, this operator maintains an interesting balance between its exploratory and exploitative capabilities and has potential for further generalizations.
{"title":"A parabolic operator for parameter optimization problems","authors":"Thomas J. R. Stidsen, O. Caprani, Z. Michalewicz","doi":"10.1109/CEC.1999.782660","DOIUrl":"https://doi.org/10.1109/CEC.1999.782660","url":null,"abstract":"Parameter optimization has been a prime target for evolutionary algorithms for a number of years. Genetic algorithms, evolution strategies, and evolutionary programming have dealt with a variety of nonlinear programming problems. There is a growing evidence that evolutionary algorithms are well suited for optimization of real valued multi-modal difficult functions of many variables. Despite this success story, there are still many open, interesting questions. One of them deals with a relationship between the recombination operators and the landscape of the problem; it seems that different problems \"require\" different operators. We propose a new multi-parent crossover operator: a parabolic crossover, which works very well for certain types of landscapes. Additionally, this operator maintains an interesting balance between its exploratory and exploitative capabilities and has potential for further generalizations.","PeriodicalId":292523,"journal":{"name":"Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128633609","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The study manipulated the neighborhood topologies of particle swarms optimizing four test functions. Several social network structures were tested, with "small-world" randomization of a specified number of links. Sociometric structure and the small-world manipulation interacted with function to produce a significant effect on performance.
{"title":"Small worlds and mega-minds: effects of neighborhood topology on particle swarm performance","authors":"J. Kennedy","doi":"10.1109/CEC.1999.785509","DOIUrl":"https://doi.org/10.1109/CEC.1999.785509","url":null,"abstract":"The study manipulated the neighborhood topologies of particle swarms optimizing four test functions. Several social network structures were tested, with \"small-world\" randomization of a specified number of links. Sociometric structure and the small-world manipulation interacted with function to produce a significant effect on performance.","PeriodicalId":292523,"journal":{"name":"Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130027629","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper evaluates the relevance of evolutionary artificial neural networks (EANNs) to forecasting the tick-by-tick DEM/USD exchange rate. An analysis based on modern econometric techniques shows this time series to be a complex nonlinear series, making it a genuine challenge for ANNs and EANNs. Based on five criteria, including the Sharpe ratio and a risk-adjusted profit rate, we compare the performance of 8 ANNs, 8 EANNs, and the random-walk (RW) model. Using the Granger-Newbold test, we find that all neural network models statistically beat the RW model on all criteria at the 1% significance level. In addition, among the 16 NN models generated with different designs, the best model is the EANN with the largest search space.
{"title":"Would evolutionary computation help in designs of ANNs in forecasting foreign exchange rates?","authors":"Shu-Heng Chen, Chun-Fen Lu","doi":"10.1109/CEC.1999.781935","DOIUrl":"https://doi.org/10.1109/CEC.1999.781935","url":null,"abstract":"This paper evaluates the relevance of evolutionary artificial neural nets to forecasting the tick-by-tick DEM/USD exchange rate. With an analysis based on modern econometric techniques, this time series is shown to be a complex nonlinear series, and is qualified to be a challenge for ANNs and EANNs. Based on the five criteria, including the Sharpe ratio and a risk-adjusted profit rate, we compare the performance of 8 ANNs, 8 EANNs and the random-walk model. By the Granger-Newbold test, it is found that all neural network models can statistically beat the RW model in all criteria at the 1% significance level. In addition, among the 16 NN models generated in different designs, the best model is the EANN with the largest search space.","PeriodicalId":292523,"journal":{"name":"Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125684289","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The problem of time series prediction provides a practical benchmark for testing the performance of evolutionary algorithms. In this paper, we compare various selection methods for genetic programming, an evolutionary computation method with variable-size tree representations, with application to time series data. Selection is an important operator that controls the dynamics of evolutionary computation. A number of selection operators have so far been proposed and tested in evolutionary algorithms with fixed-size chromosomes. However, the effect of selection schemes remains relatively unexplored in evolutionary algorithms with variable-size representations. We analyze the evolutionary dynamics of genetic programming by means of the response to selection and the selection differential proposed in the breeder genetic algorithm (BGA). The empirical analysis using the laser time-series data suggests that hard selection is preferable to soft selection. This seems to be due to the lack of heritability in genetic programming.
{"title":"Effects of selection schemes in genetic programming for time series prediction","authors":"Jung-Jib Kim, Byoung-Tak Zhang","doi":"10.1109/CEC.1999.781933","DOIUrl":"https://doi.org/10.1109/CEC.1999.781933","url":null,"abstract":"The problem of time series prediction provides a practical benchmark for testing the performance of evolutionary algorithms. In this paper, we compare various selection methods for genetic programming, an evolutionary computation with variable-size tree representations, with application to time series data. Selection is an important operator that controls the dynamics of evolutionary computation. A number of selection operators have been so far proposed and tested in evolutionary algorithms with fixed-size chromosomes. However, the effect of selection schemes remains relatively unexplored in evolutionary algorithms with variable-size representations. We analyze the evolutionary dynamics of genetic programming by means of the selection to response and the selection differential proposed in the breeder genetic algorithm (BGA). The empirical analysis using the laser time-series data suggests that hard selection is more preferable than soft selection. This seems due to the lack of heritability in genetic programming.","PeriodicalId":292523,"journal":{"name":"Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406)","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130650843","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We have performed several experiments to study the possible use of chaos in simulated evolution. Chaos is often associated with dynamic situations in which there is feedback; hence there is speculation in the literature that chaos is a factor in natural evolution. We chose the iterated prisoner's dilemma problem as a test case, since it is a dynamic environment with feedback. To further illustrate the benefits of employing chaos in genetic algorithms, data derived from a genetic data-clustering algorithm under development at the Idaho National Engineering and Environmental Laboratory is also presented. To perform an initial assessment of the use of chaos, we used the logistic function, a simple equation exhibiting chaos, as the basis of a special mutation operator, which we call λ mutation. The behavior of the logistic function is well known and comprises three characteristic ranges of operation: convergent, bifurcating, and chaotic. We hypothesize that the chaotic regime will aid exploration, and the convergent range will aid exploitation. The bifurcating range is likely neutral, and hence an insignificant factor. Our results confirm these expectations.
{"title":"Using chaos in genetic algorithms","authors":"J. Determan, J. Foster","doi":"10.1109/CEC.1999.785533","DOIUrl":"https://doi.org/10.1109/CEC.1999.785533","url":null,"abstract":"We have performed several experiments to study the possible use of chaos in simulated evolution. Chaos is often associated with dynamic situations in which there is feedback, hence there is speculation in the literature that chaos is a factor in natural evolution. We chose the iterated prisoner's dilemma problem as a test case, since it is a dynamic environment with feedback. To further illustrate the benefits of employing chaos in genetic algorithms, data derived from a genetic data clustering algorithm under development at the Idaho National Engineering and Environmental Laboratory is also presented. To perform an initial assessment of the use of chaos, we used the logistic function, a simple equation involving chaos, as the basis of a special mutation operator, which we call /spl lambda/ mutation. The behavior of the logistic function is well known and comprises three characteristic ranges of operation: convergent, bifurcating, and chaotic. We hypothesize that the chaotic regime will aid exploration, and the convergent range will aid exploitation. The bifurcating range is likely neutral, and hence an insignificant factor. Our results confirm these expectations.","PeriodicalId":292523,"journal":{"name":"Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406)","volume":"6 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132858687","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The selection of program representation can have a strong impact on the performance of genetic programming. Previous work has shown that a particular program representation supporting structure abstraction is very effective in solving the general even parity problem. We investigate program structures and analyze all perfect solutions in the search space to explain why structure abstraction is so effective on this problem. This work provides guidelines for the application of structure abstraction to other problems.
{"title":"Structure abstraction and genetic programming","authors":"Tina Yu","doi":"10.1109/CEC.1999.781995","DOIUrl":"https://doi.org/10.1109/CEC.1999.781995","url":null,"abstract":"The selection of program representation can have strong impact on the performance of genetic programming. Previous work has shown that a particular program representation which supports structure abstraction is very effective in solving the general even parity problem. We investigate program structures and analyze all perfect solutions in the search space to provide explanation of why structure abstraction is so effective with this problem. This work provides guidelines for the application of structure abstraction to other problems.","PeriodicalId":292523,"journal":{"name":"Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131326040","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}