{"title":"利用遗传编程进化启发式蒙特卡洛树搜索吃豆人代理","authors":"Atif M. Alhejali, S. Lucas","doi":"10.1109/CIG.2013.6633639","DOIUrl":null,"url":null,"abstract":"Ms Pac-Man is one of the most challenging test beds in game artificial intelligence (AI). Genetic programming and Monte Carlo Tree Search (MCTS) have already been successful applied to several games including Pac-Man. In this paper, we use Monte Carlo Tree Search to create a Ms Pac-Man playing agent before using genetic programming to enhance its performance by evolving a new default policy to replace the random agent used in the simulations. The new agent with the evolved default policy was able to achieve an 18% increase on its average score over the agent with random default policy.","PeriodicalId":158902,"journal":{"name":"2013 IEEE Conference on Computational Inteligence in Games (CIG)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"40","resultStr":"{\"title\":\"Using genetic programming to evolve heuristics for a Monte Carlo Tree Search Ms Pac-Man agent\",\"authors\":\"Atif M. Alhejali, S. Lucas\",\"doi\":\"10.1109/CIG.2013.6633639\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Ms Pac-Man is one of the most challenging test beds in game artificial intelligence (AI). Genetic programming and Monte Carlo Tree Search (MCTS) have already been successful applied to several games including Pac-Man. In this paper, we use Monte Carlo Tree Search to create a Ms Pac-Man playing agent before using genetic programming to enhance its performance by evolving a new default policy to replace the random agent used in the simulations. The new agent with the evolved default policy was able to achieve an 18% increase on its average score over the agent with random default policy.\",\"PeriodicalId\":158902,\"journal\":{\"name\":\"2013 IEEE Conference on Computational Inteligence in Games (CIG)\",\"volume\":\"3 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-10-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"40\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2013 IEEE Conference on Computational Inteligence in Games (CIG)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CIG.2013.6633639\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 IEEE Conference on Computational Inteligence in Games (CIG)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CIG.2013.6633639","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Using genetic programming to evolve heuristics for a Monte Carlo Tree Search Ms Pac-Man agent
Ms Pac-Man is one of the most challenging test beds in game artificial intelligence (AI). Genetic programming and Monte Carlo Tree Search (MCTS) have already been successfully applied to several games, including Pac-Man. In this paper, we use Monte Carlo Tree Search to create a Ms Pac-Man playing agent, then use genetic programming to enhance its performance by evolving a new default policy to replace the random agent used in the simulations. The agent with the evolved default policy achieved an 18% increase in average score over the agent with the random default policy.
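The core idea described in the abstract is standard MCTS with only the simulation (default) policy swapped out: instead of choosing rollout moves uniformly at random, the agent consults a GP-evolved heuristic during playouts. The sketch below is a minimal illustration of that swap, assuming a toy game interface; the function names (legal_moves, apply_move, evolved_policy, etc.) are hypothetical stand-ins, not the paper's code or the Ms Pac-Man vs Ghosts framework API.

```python
import random

# Hypothetical, simplified game interface used only for illustration.
def legal_moves(state):
    return state["moves"]

def apply_move(state, move):
    # Toy transition: accumulate some score and count down remaining steps.
    return {"moves": state["moves"],
            "score": state["score"] + random.randint(0, 10),
            "steps": state["steps"] - 1}

def is_terminal(state):
    return state["steps"] <= 0

def random_policy(state):
    """Baseline default policy: pick a legal move uniformly at random."""
    return random.choice(legal_moves(state))

def evolved_policy(state):
    """Stand-in for a GP-evolved heuristic; here it simply prefers the
    first legal move, marking where the evolved tree would be evaluated."""
    return legal_moves(state)[0]

def rollout(state, default_policy):
    """Simulate to a terminal state with the given default policy and
    return the final score, as in the MCTS simulation (playout) phase."""
    while not is_terminal(state):
        state = apply_move(state, default_policy(state))
    return state["score"]

if __name__ == "__main__":
    start = {"moves": ["UP", "DOWN", "LEFT", "RIGHT"], "score": 0, "steps": 50}
    # The paper's key change: swap random_policy for the evolved heuristic
    # inside the simulation phase while the rest of MCTS stays unchanged.
    print("random default policy :", rollout(dict(start), random_policy))
    print("evolved default policy:", rollout(dict(start), evolved_policy))
```

Because only the rollout policy changes, the selection, expansion, and backpropagation phases of the MCTS agent are untouched; the reported 18% average-score improvement comes solely from more informative playouts.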