High-Quality Hypergraph Partitioning

Journal of Experimental Algorithmics (Q2, Mathematics) · Publication date: 2021-06-16 · DOI: 10.1145/3529090
Sebastian Schlag, Tobias Heuer, Lars Gottesbüren, Yaroslav Akhremtsev, Christian Schulz, P. Sanders
{"title":"High-Quality Hypergraph Partitioning","authors":"Sebastian Schlag, Tobias Heuer, Lars Gottesbüren, Yaroslav Akhremtsev, Christian Schulz, P. Sanders","doi":"10.1145/3529090","DOIUrl":null,"url":null,"abstract":"Hypergraphs are a generalization of graphs where edges (aka nets) are allowed to connect more than two vertices. They have a similarly wide range of applications as graphs. This article considers the fundamental and intensively studied problem of balanced hypergraph partitioning (BHP), which asks for partitioning the vertices into k disjoint blocks of bounded size while minimizing an objective function over the hyperedges. Here, we consider the two most commonly used objectives: the cut-net metric and the connectivity metric. We describe our open-source hypergraph partitioner KaHyPar which is based on the successful multi-level approach—driving it to the extreme of using one level for (almost) every vertex. Using carefully designed data structures and dynamic update techniques, this approach turns out to have a very good time–quality tradeoff. We present two preprocessing techniques—pin sparsification using locality-sensitive hashing (LSH) and community detection based on the Louvain algorithm. The community structure is used to guide the coarsening process that incrementally contracts vertices. Portfolio-based partitioning of the contracted hypergraph then already achieves a good initial solution. While reversing the contraction process, a combination of several refinement techniques achieves a good final partitioning. In particular, we support a highly-localized local search that can directly produce a k-way partitioning and complement this with flow-based techniques that take a more global view. Optionally, a memetic algorithm evolves a pool of solution candidates to an overall good solution. We evaluate KaHyPar for a large set of instances from a wide range of application domains. With respect to quality, KaHyPar outperforms all previously considered systems that can handle large hypergraphs such as hMETIS, PaToH, Mondriaan, or Zoltan. Somewhat surprisingly, to some extend, this even extends to graph partitioners such as KaHIP when considering the special case of graphs. KaHyPar is also faster than most of these systems except for PaToH which represents a different speed–quality tradeoff.","PeriodicalId":53707,"journal":{"name":"Journal of Experimental Algorithmics","volume":"27 1","pages":"1 - 39"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"46","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Experimental Algorithmics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3529090","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Mathematics","Score":null,"Total":0}
Citations: 46

Abstract

Hypergraphs are a generalization of graphs where edges (also known as nets) are allowed to connect more than two vertices. They have a similarly wide range of applications as graphs. This article considers the fundamental and intensively studied problem of balanced hypergraph partitioning (BHP), which asks for partitioning the vertices into k disjoint blocks of bounded size while minimizing an objective function over the hyperedges. Here, we consider the two most commonly used objectives: the cut-net metric and the connectivity metric. We describe our open-source hypergraph partitioner KaHyPar, which is based on the successful multi-level approach, driving it to the extreme of using one level for (almost) every vertex. Using carefully designed data structures and dynamic update techniques, this approach turns out to have a very good time–quality tradeoff. We present two preprocessing techniques: pin sparsification using locality-sensitive hashing (LSH) and community detection based on the Louvain algorithm. The community structure is used to guide the coarsening process that incrementally contracts vertices. Portfolio-based partitioning of the contracted hypergraph then already achieves a good initial solution. While reversing the contraction process, a combination of several refinement techniques achieves a good final partitioning. In particular, we support a highly localized local search that can directly produce a k-way partitioning and complement this with flow-based techniques that take a more global view. Optionally, a memetic algorithm evolves a pool of solution candidates into an overall good solution. We evaluate KaHyPar on a large set of instances from a wide range of application domains. With respect to quality, KaHyPar outperforms all previously considered systems that can handle large hypergraphs, such as hMETIS, PaToH, Mondriaan, or Zoltan. Somewhat surprisingly, this even extends, to some extent, to graph partitioners such as KaHIP when considering the special case of graphs. KaHyPar is also faster than most of these systems, except for PaToH, which represents a different speed–quality tradeoff.
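For reference, the two objectives mentioned in the abstract can be stated formally. The following definitions are not spelled out in the abstract itself; they paraphrase the standard formulation used in the hypergraph-partitioning literature: given a hypergraph H = (V, E) with net weights ω(e), vertex weights c(v), and a k-way partition Π = {V_1, …, V_k} with imbalance parameter ε, where λ(e) denotes the number of blocks connected by net e.

```latex
% Balance constraint on the block weights c(V_i):
\[
  c(V_i) \le (1+\varepsilon) \left\lceil \frac{c(V)}{k} \right\rceil
  \qquad \text{for all } 1 \le i \le k
\]
% Cut-net metric: total weight of nets spanning more than one block.
\[
  \mathrm{cut}(\Pi) = \sum_{e \in E,\; \lambda(e) > 1} \omega(e)
\]
% Connectivity (lambda - 1) metric:
\[
  (\lambda - 1)(\Pi) = \sum_{e \in E} \bigl(\lambda(e) - 1\bigr)\, \omega(e)
\]
```

A minimal sketch of how the two metrics can be evaluated for a given partition is shown below; the function names and data layout are illustrative assumptions and are not taken from KaHyPar's API.

```python
# Evaluate the cut-net and connectivity (lambda - 1) objectives for a given
# k-way partition of a hypergraph. Purely illustrative; identifiers are
# hypothetical and not part of KaHyPar.

def cut_net(nets, part, net_weights=None):
    """Total weight of nets whose pins lie in more than one block."""
    net_weights = net_weights or [1] * len(nets)
    total = 0
    for e, pins in enumerate(nets):
        blocks = {part[v] for v in pins}   # blocks connected by net e
        if len(blocks) > 1:                # net e is cut
            total += net_weights[e]
    return total

def connectivity(nets, part, net_weights=None):
    """Sum over all nets e of (lambda(e) - 1) * weight(e)."""
    net_weights = net_weights or [1] * len(nets)
    total = 0
    for e, pins in enumerate(nets):
        lam = len({part[v] for v in pins})  # lambda(e)
        total += (lam - 1) * net_weights[e]
    return total

if __name__ == "__main__":
    # Six vertices, three nets, bipartition {0, 1, 2} | {3, 4, 5}.
    nets = [[0, 1, 2], [2, 3], [3, 4, 5]]
    part = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}
    print(cut_net(nets, part))       # -> 1 (only net [2, 3] is cut)
    print(connectivity(nets, part))  # -> 1 ((1-1) + (2-1) + (1-1))
```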
Source journal
Journal of Experimental Algorithmics (Mathematics – Theoretical Computer Science)
CiteScore: 3.10
Self-citation rate: 0.00%
Articles published: 29
About the journal: The ACM JEA is a high-quality, refereed, archival journal devoted to the study of discrete algorithms and data structures through a combination of experimentation and classical analysis and design techniques. It focuses on the following areas in algorithms and data structures:
■ combinatorial optimization
■ computational biology
■ computational geometry
■ graph manipulation
■ graphics
■ heuristics
■ network design
■ parallel processing
■ routing and scheduling
■ searching and sorting
■ VLSI design
Latest articles in this journal
Random projections for Linear Programming: an improved retrieval phase
SAT-Boosted Tabu Search for Coloring Massive Graphs
An Experimental Evaluation of Semidefinite Programming and Spectral Algorithms for Max Cut
A constructive heuristic for the uniform capacitated vertex k-center problem
Algorithms for Efficiently Computing Structural Anonymity in Complex Networks