Automatic Differentiation in Prolog
TOM SCHRIJVERS, BIRTHE VAN DEN BERG, FABRIZIO RIGUZZI
Theory and Practice of Logic Programming, published 2023-07-01. DOI: 10.1017/s1471068423000145
Abstract
Automatic differentiation (AD) is a range of algorithms to compute the numeric value of a function’s (partial) derivative, where the function is typically given as a computer program or abstract syntax tree. AD has become immensely popular as part of many learning algorithms, notably for neural networks. This paper uses Prolog to systematically derive gradient-based forward- and reverse-mode AD variants from a simple executable specification: evaluation of the symbolic derivative. Along the way we demonstrate that several Prolog features (DCGs, co-routines) contribute to the succinct formulation of the algorithm. We also discuss two applications in probabilistic programming that are enabled by our Prolog algorithms. The first is parameter learning for the Sum-Product Loop Language and the second consists of both parameter learning and variational inference for probabilistic logic programming.
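To make the forward-mode idea concrete, the following is a minimal, illustrative Prolog sketch (not the paper's code): it pairs each subexpression's value with its derivative during a single evaluation pass, in the spirit of dual-number forward-mode AD. The predicate name fwd/4 and the restriction to a single variable x with + and * are assumptions made here for illustration only.

    % Illustrative forward-mode AD sketch (assumed interface, not the paper's).
    % fwd(+Expr, +X, -Val, -Der): evaluate Expr at x = X, computing both the
    % value Val and the derivative Der = d(Expr)/dx in one traversal.
    fwd(x, X, X, 1).
    fwd(N, _, N, 0) :- number(N).
    fwd(A + B, X, V, D) :-
        fwd(A, X, VA, DA),
        fwd(B, X, VB, DB),
        V is VA + VB,
        D is DA + DB.              % sum rule
    fwd(A * B, X, V, D) :-
        fwd(A, X, VA, DA),
        fwd(B, X, VB, DB),
        V is VA * VB,
        D is DA * VB + VA * DB.    % product rule

    % Example query:
    % ?- fwd(x * x + 3 * x, 2, V, D).
    % V = 10, D = 7.

Reverse mode, the DCG- and co-routine-based formulations, and gradients over many parameters are beyond this sketch; they are what the paper derives systematically from the symbolic-derivative specification.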
Journal description:
Theory and Practice of Logic Programming emphasises both the theory and practice of logic programming. Logic programming is fundamental to, and applies across, all areas of artificial intelligence and computer science. Among the topics covered are AI applications that use logic programming, logic programming methodologies, specification, analysis and verification of systems, inductive logic programming, multi-relational data mining, natural language processing, knowledge representation, non-monotonic reasoning, semantic web reasoning, databases, implementations and architectures, and constraint logic programming.