Modified Gram-Schmidt Algorithm for Extreme Learning Machine
Jianchuan Yin, Fang Dong, Nini Wang
International Symposium on Computational Intelligence and Design, 2009-12-12
DOI: 10.1109/ISCID.2009.275
Citations: 5
Abstract
Extreme learning machine (ELM) has been shown to be extremely fast while offering good generalization performance. The basic idea of the ELM algorithm is to choose the parameters of the hidden nodes at random and then solve for the output weights of the network with a simple generalized-inverse operation. Such a procedure faces two problems. First, ELM tends to require more random hidden nodes than conventional tuning-based algorithms. Second, choosing an appropriate number of random hidden nodes involves subjectivity. In this paper, we propose an enhanced-ELM (en-ELM) algorithm that applies the modified Gram-Schmidt (MGS) method to select hidden nodes from a pool of random hidden nodes. Furthermore, enhanced-ELM uses Akaike's final prediction error (FPE) criterion to determine the number of random hidden nodes automatically. In comparison with the conventional ELM learning method on several commonly used regression benchmark problems, the enhanced-ELM algorithm achieves a compact network with much faster response and satisfactory accuracy.
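The basic ELM procedure the abstract refers to (randomly drawn hidden-node parameters, then one generalized-inverse solve for the output weights) can be sketched as follows. The sigmoid activation, the uniform [-1, 1] initialization, and the toy sin(x) task are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def elm_train(X, y, n_hidden, seed=None):
    """Basic ELM regressor (illustrative sketch, not the paper's en-ELM).

    Hidden-node parameters are drawn at random; only the output weights
    are solved for, via the Moore-Penrose (generalized) inverse.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Randomly chosen hidden-node parameters: input weights and biases
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    # Hidden-layer output matrix H with a sigmoid activation
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    # Output weights via the generalized inverse: beta = pinv(H) y
    beta = np.linalg.pinv(H) @ y
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy regression problem: fit y = sin(x) on [0, 2*pi]
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(200, 1))
y = np.sin(X).ravel()
W, b, beta = elm_train(X, y, n_hidden=30, seed=1)
mse = np.mean((elm_predict(X, W, b, beta) - y) ** 2)
```

Note that no iterative tuning of `W` or `b` takes place; the entire "training" is the single pseudoinverse solve, which is what makes ELM fast.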
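One plausible reading of the selection step, in the spirit of orthogonal least squares: rank the candidate hidden-node outputs by how much of the target each explains, orthogonalize the remaining pool with a modified Gram-Schmidt step after every pick, and monitor Akaike's FPE, here assumed to be mse * (N + k) / (N - k) for N samples and k nodes. This is a hedged sketch only; the paper's exact en-ELM procedure may differ, and the synthetic data below is invented for illustration:

```python
import numpy as np

def mgs_select(H, y, max_nodes):
    """Greedy hidden-node selection by modified Gram-Schmidt with an
    FPE stopping rule. Illustrative sketch, not the paper's exact en-ELM."""
    N = H.shape[0]
    Q = H.astype(float).copy()   # working copy of the pool, deflated in place
    r = y.astype(float).copy()   # residual of the target
    selected, best_fpe = [], np.inf
    for k in range(1, max_nodes + 1):
        norms = np.sum(Q ** 2, axis=0)
        norms[norms < 1e-12] = np.inf           # guard degenerate columns
        scores = (Q.T @ r) ** 2 / norms         # energy of r each column explains
        if selected:
            scores[selected] = -np.inf          # never re-pick a node
        j = int(np.argmax(scores))
        q = Q[:, j] / np.linalg.norm(Q[:, j])
        r = r - q * (q @ r)                     # deflate the residual
        Q = Q - np.outer(q, q @ Q)              # MGS step on the whole pool
        mse = np.mean(r ** 2)
        fpe = mse * (N + k) / (N - k)           # assumed form of Akaike's FPE
        if fpe >= best_fpe:                     # stop once FPE stops improving
            break
        best_fpe = fpe
        selected.append(j)
    return selected

# Synthetic check: the target depends on candidate nodes 2 and 5 only
rng = np.random.default_rng(0)
H = rng.standard_normal((200, 20))              # pool of 20 candidate node outputs
y = H[:, 2] - H[:, 5] + 0.1 * rng.standard_normal(200)
selected = mgs_select(H, y, max_nodes=20)
```

On this data the two informative candidates are picked first, which is the point of the method: the useful nodes from the random pool are kept and the rest can be discarded, yielding the compact network the abstract claims.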