EnD: Enhanced Dedensification for Graph Compressing and Embedding

Tanvir Hossain, Esra Akbas, Muhammad Ifte Khairul Islam

2022 IEEE International Conference on Data Mining Workshops (ICDMW), 2022-11-01. DOI: 10.1109/ICDMW58026.2022.00092
Citations: 0
Abstract
Graph representation learning is essential for applying machine learning methods to large-scale networks. Several embedding approaches have shown promising results in recent years. Nonetheless, directly applying existing embedding methods to massive graphs can be time-consuming and space-inefficient. This paper presents a novel graph compression approach based on dedensification, called Enhanced Dedensification with degree-based compression (EnD). The principal goal of our approach is to achieve effective compression of large graphs that in turn facilitates their representation learning. To this end, we first compress the low-degree nodes and dedensify the graph to reduce the load on high-degree nodes. We then embed the compressed graph instead of the original one to decrease the representation learning cost. Our approach is a general meta-strategy that attains time and space efficiency over the original graph when applied with state-of-the-art graph embedding methods: Node2vec, DeepWalk, RiWalk, and xNetMf. Comprehensive experiments on large-scale real-world graphs validate the viability of our method, which shows sound performance on single- and multi-label node classification tasks without losing accuracy.
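To make the dedensification idea above concrete, the following is a minimal sketch (not the paper's exact EnD algorithm; the function name, threshold parameter, and compressor-node naming are illustrative assumptions): low-degree nodes that share the same set of high-degree neighbors are routed through a single "compressor" node, so many hub edges collapse into a few.

```python
# Simplified illustration of dedensification. This is a hypothetical sketch,
# not the authors' EnD implementation: it only compresses groups of low-degree
# nodes that share an identical set of high-degree (hub) neighbors.
from collections import defaultdict

def dedensify(adj, degree_threshold):
    """adj: dict mapping node -> set of neighbors (undirected graph).
    Returns a new adjacency dict where shared hub connections are replaced
    by compressor nodes named ('C', i)."""
    # Hubs are nodes at or above the degree threshold.
    hubs_all = {v for v, nbrs in adj.items() if len(nbrs) >= degree_threshold}

    # Group low-degree nodes by the exact set of hubs they attach to.
    groups = defaultdict(list)
    for v, nbrs in adj.items():
        if v in hubs_all:
            continue
        shared_hubs = frozenset(nbrs & hubs_all)
        if len(shared_hubs) >= 2:  # compressing a single hub edge saves nothing
            groups[shared_hubs].append(v)

    new_adj = {v: set(nbrs) for v, nbrs in adj.items()}
    for i, (hubs, members) in enumerate(groups.items()):
        if len(members) < 2:       # need multiple members for a net saving
            continue
        c = ('C', i)               # fresh compressor node
        new_adj[c] = set()
        # Replace the |members| * |hubs| direct edges with
        # |members| + |hubs| edges through the compressor.
        for v in members:
            for h in hubs:
                new_adj[v].discard(h)
                new_adj[h].discard(v)
            new_adj[v].add(c)
            new_adj[c].add(v)
        for h in hubs:
            new_adj[h].add(c)
            new_adj[c].add(h)
    return new_adj
```

For a group of m low-degree nodes sharing k hubs, this replaces m * k edges with m + k, which is the source of the space savings; embedding methods are then run on this smaller graph rather than the original.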