{"title":"人工神经网络:综述","authors":"Jaswinder Kaur, Neha Gupta","doi":"10.30780/specialissue-icaccg2020/007","DOIUrl":null,"url":null,"abstract":"DOI Number: https://doi.org/10.30780/specialissue-ICACCG2020/007 pg.1 Paper Id: IJTRS-ICACCG2020-007 @2017, IJTRS All Right Reserved, www.ijtrs.com ARTIFICIAL NEURAL NETWORK: A REVIEW Jaswinder Kaur, Neha Gupta E-Mail Id: jasukaur@rediffmail.com, nehagupta@ansaluniversity.edu.in School of Engineering & Technology, Ansal University, Gurgaon, India AbstractIn this paper an introduction of Artificial Neural Network is presented. Learning Algorithms like Supervised Algorithms, Reinforcement Algorithms and Unsupervised Algorithms are discussed. Also, optimization methods like Gradient Descent, Newton Method, Conjugate Gradient Method, Quasi Newton and Levenberg Marquardt are presented.","PeriodicalId":302312,"journal":{"name":"International Journal of Technical Research & Science","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-08-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"ARTIFICIAL NEURAL NETWORK: A REVIEW\",\"authors\":\"Jaswinder Kaur, Neha Gupta\",\"doi\":\"10.30780/specialissue-icaccg2020/007\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"DOI Number: https://doi.org/10.30780/specialissue-ICACCG2020/007 pg.1 Paper Id: IJTRS-ICACCG2020-007 @2017, IJTRS All Right Reserved, www.ijtrs.com ARTIFICIAL NEURAL NETWORK: A REVIEW Jaswinder Kaur, Neha Gupta E-Mail Id: jasukaur@rediffmail.com, nehagupta@ansaluniversity.edu.in School of Engineering & Technology, Ansal University, Gurgaon, India AbstractIn this paper an introduction of Artificial Neural Network is presented. Learning Algorithms like Supervised Algorithms, Reinforcement Algorithms and Unsupervised Algorithms are discussed. Also, optimization methods like Gradient Descent, Newton Method, Conjugate Gradient Method, Quasi Newton and Levenberg Marquardt are presented.\",\"PeriodicalId\":302312,\"journal\":{\"name\":\"International Journal of Technical Research & Science\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-08-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Technical Research & Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.30780/specialissue-icaccg2020/007\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Technical Research & Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.30780/specialissue-icaccg2020/007","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
In this paper, an introduction to Artificial Neural Networks is presented. Learning paradigms, namely supervised, reinforcement, and unsupervised learning algorithms, are discussed. Optimization methods such as gradient descent, Newton's method, the conjugate gradient method, quasi-Newton methods, and the Levenberg-Marquardt algorithm are also presented.
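The abstract lists gradient descent among the optimization methods reviewed. As a minimal illustrative sketch (not code from the paper), the snippet below applies batch gradient descent to a least-squares problem; the synthetic data, learning rate, and iteration count are assumptions made here for demonstration only.

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, n_iters=1000):
    """Minimize the mean squared error (1/2n)*||Xw - y||^2 by batch gradient descent."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    for _ in range(n_iters):
        residual = X @ w - y                # prediction error on the full batch
        grad = X.T @ residual / n_samples   # gradient of the MSE loss w.r.t. w
        w -= lr * grad                      # step in the direction of steepest descent
    return w

if __name__ == "__main__":
    # Illustrative synthetic data (not from the paper).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    true_w = np.array([2.0, -3.0])
    y = X @ true_w + 0.1 * rng.normal(size=100)
    w_est = gradient_descent(X, y)
    print("estimated weights:", w_est)      # should be close to [2.0, -3.0]
```

The same loop structure carries over to the other methods the review names (Newton, conjugate gradient, quasi-Newton, Levenberg-Marquardt); they differ in how the update direction is computed from gradient and curvature information rather than in the overall iteration scheme.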