Title: An improved genetic algorithm for hydrological model calibration
Authors: Jungang Luo, Jiancang Xie, Yuxin Ma, Gang Zhang
Venue: 2011 Seventh International Conference on Natural Computation
Publication date: 2011-07-26
DOI: 10.1109/ICNC.2011.6022399 (https://doi.org/10.1109/ICNC.2011.6022399)
Citations: 2
Abstract
To overcome the slow convergence and premature convergence of the genetic algorithm, an improved genetic algorithm with directional self-learning (DSLGA) is proposed in this paper. Directional information is introduced into the local search process of the self-learning operator, and the search direction is guided by the pseudo-gradient of the objective function. Through competition, cooperation, and learning among individuals, the best solution is updated continuously. A deletion operator is also proposed to increase population diversity, which avoids premature convergence and improves the convergence speed of the algorithm. Theoretical analysis proves that DSLGA is globally convergent. In experiments, DSLGA was tested on five unconstrained high-dimensional functions and its results were compared with those of MAGA. Finally, DSLGA was applied to parameter estimation for the Muskingum model and compared with GAGA and MAGA. The experimental and application results show that DSLGA clearly outperforms the above algorithms in both solution quality and computational cost.
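The Muskingum model mentioned in the abstract routes a flood wave through a river reach using two parameters: the storage time constant K and the weighting factor x. Calibration searches for the (K, x) pair whose routed outflow best matches observations, which is the optimization problem DSLGA is applied to. A minimal sketch of the standard (linear, textbook) Muskingum routing step follows; the function name and interface are illustrative, not taken from the paper:

```python
def muskingum_route(inflow, K, x, dt, initial_outflow=None):
    """Route an inflow hydrograph through a single Muskingum reach.

    K  : storage time constant (same time units as dt)
    x  : weighting factor, typically 0 <= x <= 0.5
    dt : routing time step
    """
    denom = 2.0 * K * (1.0 - x) + dt
    c0 = (dt - 2.0 * K * x) / denom
    c1 = (dt + 2.0 * K * x) / denom
    c2 = (2.0 * K * (1.0 - x) - dt) / denom
    # By construction c0 + c1 + c2 == 1 (mass balance of the linear scheme).
    outflow = [inflow[0] if initial_outflow is None else initial_outflow]
    for i_now, i_prev in zip(inflow[1:], inflow[:-1]):
        # O_{t+1} = c0 * I_{t+1} + c1 * I_t + c2 * O_t
        outflow.append(c0 * i_now + c1 * i_prev + c2 * outflow[-1])
    return outflow
```

A calibration routine such as DSLGA would evaluate each candidate (K, x) by routing the observed inflow with this function and scoring the mismatch (e.g. sum of squared errors) against the observed outflow.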