{"title":"DLGR: A Rule-Based Approach to Graph Replacement for Deep Learning","authors":"Enze Ma","doi":"10.1109/ICECCS54210.2022.00030","DOIUrl":null,"url":null,"abstract":"In deep learning libraries like TensorFlow, compu-tations are manually batched as computation graphs. Graph replacement is then an optimization that replaces one subgraph of a computation graph with another whilst keeping the graphs before and after replacement functionally equivalent. Meanwhile, in practice, it remains a challenge how graph replacements can be performed efficiently: graph replacement is usually conducted by human engineers, and thus it incurs many human efforts since a variety of deep learning models do exist and a number of model-specific replacements can be performed; the functionality equivalence of graphs before and after replacement is also not easy to guarantee. To tackle with this challenge, we introduce in this paper DLGR, a rule-based approach to graph replacement for deep learning. The core idea of DLGR is to define a set of replacement rules, each of which specifies the source and the tar-get graph patterns and constraints on graph replacement. Given a computation graph, DLGR then performs an iterative process of matching and replacing subgraphs in the source graph, and generates a replaced, and usually optimized computation graph. We conduct experiments to evaluate the capabilities of DLGR. The results clearly show the strengths of DLGR: compared with two existing graph replacement techniques, it provides with more replacement rules and saves engineers' development efforts in reducing up to 68 % lines of code.","PeriodicalId":344493,"journal":{"name":"2022 26th International Conference on Engineering of Complex Computer Systems (ICECCS)","volume":"55 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 26th International Conference on Engineering of Complex Computer Systems (ICECCS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICECCS54210.2022.00030","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In deep learning libraries like TensorFlow, computations are manually batched as computation graphs. Graph replacement is then an optimization that replaces one subgraph of a computation graph with another while keeping the graphs before and after replacement functionally equivalent. In practice, however, performing graph replacements efficiently remains a challenge: graph replacement is usually conducted by human engineers, and thus incurs substantial human effort, since a wide variety of deep learning models exist and many model-specific replacements can be performed; the functional equivalence of the graphs before and after replacement is also not easy to guarantee. To tackle this challenge, we introduce DLGR, a rule-based approach to graph replacement for deep learning. The core idea of DLGR is to define a set of replacement rules, each of which specifies the source and target graph patterns and constraints on the replacement. Given a computation graph, DLGR performs an iterative process of matching and replacing subgraphs in the source graph, and generates a replaced, and usually optimized, computation graph. We conduct experiments to evaluate the capabilities of DLGR. The results clearly show its strengths: compared with two existing graph replacement techniques, DLGR provides more replacement rules and saves engineers' development effort, reducing lines of code by up to 68%.
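The abstract does not expose DLGR's actual rule format or API, but the rule-based match-and-replace loop it describes can be sketched in a few lines of Python. The sketch below is only an illustration under assumed names: the Node, Rule, match_chain, and apply_rules definitions, the toy graph representation (a dict of named ops), and the example rule that cancels two inverse transposes are all hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of rule-based graph replacement; NOT DLGR's implementation.
# A rule pairs a source pattern (a chain of op types), a constraint check,
# and a builder for the replacement; rewriting repeats until no rule fires.

from dataclasses import dataclass, field
from typing import Callable, Optional


@dataclass
class Node:
    name: str
    op: str
    inputs: list                      # names of producer nodes
    attrs: dict = field(default_factory=dict)


@dataclass
class Rule:
    name: str
    source: list                                 # chain of op types, e.g. ["Transpose", "Transpose"]
    constraint: Callable[[list], bool]            # extra conditions on the matched nodes
    build: Callable[[list], Optional[Node]]       # replacement node (None = delete the chain)


def match_chain(graph: dict, tail: Node, pattern: list) -> Optional[list]:
    """Match `pattern` as a producer chain ending at `tail`; return nodes in pattern order."""
    chain, node = [], tail
    for op in reversed(pattern):
        if node is None or node.op != op:
            return None
        chain.append(node)
        node = graph.get(node.inputs[0]) if node.inputs else None
    return list(reversed(chain))


def apply_rules(graph: dict, rules: list) -> dict:
    """Iteratively rewrite `graph` (name -> Node) until no rule matches."""
    changed = True
    while changed:
        changed = False
        for tail in list(graph.values()):
            for rule in rules:
                chain = match_chain(graph, tail, rule.source)
                if chain is None or not rule.constraint(chain):
                    continue
                replacement = rule.build(chain)
                # Reroute consumers of the chain's output to the replacement
                # (or to the chain's original input if the chain is deleted).
                new_name = replacement.name if replacement else chain[0].inputs[0]
                for node in graph.values():
                    node.inputs = [new_name if i == chain[-1].name else i
                                   for i in node.inputs]
                for node in chain:
                    graph.pop(node.name, None)
                if replacement:
                    graph[replacement.name] = replacement
                changed = True
                break
            if changed:
                break
    return graph


# Example rule: two consecutive transposes whose permutations compose to the
# identity cancel out, so the pair is removed and consumers read the original input.
def _inverse_perms(chain):
    p1, p2 = chain[0].attrs["perm"], chain[1].attrs["perm"]
    return [p2[i] for i in p1] == list(range(len(p1)))


cancel_transposes = Rule(
    name="cancel-adjacent-transposes",
    source=["Transpose", "Transpose"],
    constraint=_inverse_perms,
    build=lambda chain: None,           # delete the matched pair entirely
)

graph = {
    "x":  Node("x",  "Input",     []),
    "t1": Node("t1", "Transpose", ["x"],  {"perm": [1, 0]}),
    "t2": Node("t2", "Transpose", ["t1"], {"perm": [1, 0]}),
    "y":  Node("y",  "Relu",      ["t2"]),
}
optimized = apply_rules(graph, [cancel_transposes])
print(sorted(optimized))   # ['x', 'y']; Relu now reads directly from x
```

In this toy form, the constraint callable plays the role of the paper's replacement constraints (here, that the two permutations are inverses), and functional equivalence follows from the rule only firing when the constraint holds.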