Václav Blažej, Boris Klemz, Felix Klesen, Marie Diana Sieper, Alexander Wolff, Johannes Zink
The problem Level Planarity asks for a crossing-free drawing of a graph in the plane such that vertices are placed at prescribed y-coordinates (called levels) and such that every edge is realized as a y-monotone curve. In the variant Constrained Level Planarity (CLP), each level $y$ is equipped with a partial order $\prec_y$ on its vertices, and in the desired drawing the left-to-right order of vertices on level $y$ has to be a linear extension of $\prec_y$. Ordered Level Planarity (OLP) corresponds to the special case of CLP where the given partial orders $\prec_y$ are total orders. Previous results by Brückner and Rutter [SODA 2017] and Klemz and Rote [ACM Trans. Alg. 2019] state that both CLP and OLP are NP-hard even in severely restricted cases. In particular, they remain NP-hard even when restricted to instances whose width (the maximum number of vertices that may share a common level) is at most two. In this paper, we focus on the other dimension: we study the parameterized complexity of CLP and OLP with respect to the height (the number of levels).

We show that OLP parameterized by the height is complete with respect to the complexity class XNLP, which was first studied by Elberfeld et al. [Algorithmica 2015] (under a different name) and recently made more prominent by Bodlaender et al. [FOCS 2021]. It contains all parameterized problems that can be solved nondeterministically in time $f(k)\, n^{O(1)}$ and space $f(k) \log n$ (where $f$ is a computable function, $n$ is the input size, and $k$ is the parameter). If a problem is XNLP-complete, it lies in XP, but is W[$t$]-hard for every $t$.

In contrast, whereas OLP parameterized by the height lies in XP, CLP turns out to be NP-hard even when restricted to instances of height 4. We complement this result by showing that CLP can be solved in polynomial time for instances of height at most 3.
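To make the Ordered Level Planarity setting concrete, here is a hedged Python sketch (not from the paper) that decides the severely restricted special case of a proper instance, in which every edge connects two adjacent levels; there, the prescribed total orders admit a planar drawing exactly when no two edges between the same pair of levels interleave. The function name and the instance encoding are illustrative choices; edges spanning several levels, which the general problem allows, are not handled.

```python
# Minimal sketch: Ordered Level Planarity for "proper" instances only,
# i.e., every edge connects two adjacent levels. Illustrative, not the paper's algorithm.
from collections import defaultdict

def olp_proper_instance_planar(levels, edges):
    """levels: list of lists; levels[y] is the prescribed left-to-right order on level y.
    edges: list of pairs (u, v) connecting vertices on adjacent levels.
    Returns True iff the prescribed orders admit a crossing-free proper level drawing."""
    level_of = {v: y for y, row in enumerate(levels) for v in row}
    pos = {v: x for row in levels for x, v in enumerate(row)}

    # Group edges by the level gap they span; orient them bottom-to-top.
    gaps = defaultdict(list)
    for u, v in edges:
        if level_of[u] > level_of[v]:
            u, v = v, u
        assert level_of[v] == level_of[u] + 1, "sketch handles proper instances only"
        gaps[level_of[u]].append((u, v))

    # Two edges in the same gap cross iff their endpoint orders interleave.
    for gap_edges in gaps.values():
        for i in range(len(gap_edges)):
            for j in range(i + 1, len(gap_edges)):
                (u1, v1), (u2, v2) = gap_edges[i], gap_edges[j]
                if u1 == u2 or v1 == v2:
                    continue  # edges sharing an endpoint never need to cross
                if (pos[u1] - pos[u2]) * (pos[v1] - pos[v2]) < 0:
                    return False
    return True

# A path a-c, b-d on two levels is planar with one vertex order but not the other.
print(olp_proper_instance_planar([["a", "b"], ["c", "d"]], [("a", "c"), ("b", "d")]))  # True
print(olp_proper_instance_planar([["a", "b"], ["d", "c"]], [("a", "c"), ("b", "d")]))  # False
```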
{"title":"Constrained and Ordered Level Planarity Parameterized by the Number of Levels","authors":"Václav Blažej, Boris Klemz, Felix Klesen, Marie Diana Sieper, Alexander Wolff, Johannes Zink","doi":"arxiv-2403.13702","DOIUrl":"https://doi.org/arxiv-2403.13702","url":null,"abstract":"The problem Level Planarity asks for a crossing-free drawing of a graph in\u0000the plane such that vertices are placed at prescribed y-coordinates (called\u0000levels) and such that every edge is realized as a y-monotone curve. In the\u0000variant Constrained Level Planarity (CLP), each level $y$ is equipped with a\u0000partial order $prec_y$ on its vertices and in the desired drawing the\u0000left-to-right order of vertices on level $y$ has to be a linear extension of\u0000$prec_y$. Ordered Level Planarity (OLP) corresponds to the special case of CLP\u0000where the given partial orders $prec_y$ are total orders. Previous results by\u0000Br\"uckner and Rutter [SODA 2017] and Klemz and Rote [ACM Trans. Alg. 2019]\u0000state that both CLP and OLP are NP-hard even in severely restricted cases. In\u0000particular, they remain NP-hard even when restricted to instances whose width\u0000(the maximum number of vertices that may share a common level) is at most two.\u0000In this paper, we focus on the other dimension: we study the parameterized\u0000complexity of CLP and OLP with respect to the height (the number of levels). We show that OLP parameterized by the height is complete with respect to the\u0000complexity class XNLP, which was first studied by Elberfeld et al.\u0000[Algorithmica 2015] (under a different name) and recently made more prominent\u0000by Bodlaender et al. [FOCS 2021]. It contains all parameterized problems that\u0000can be solved nondeterministically in time $f(k) n^{O(1)}$ and space $f(k) log\u0000n$ (where $f$ is a computable function, $n$ is the input size, and $k$ is the\u0000parameter). If a problem is XNLP-complete, it lies in XP, but is W[$t$]-hard\u0000for every $t$. In contrast to the fact that OLP parameterized by the height lies in XP, it\u0000turns out that CLP is NP-hard even when restricted to instances of height 4. We\u0000complement this result by showing that CLP can be solved in polynomial time for\u0000instances of height at most 3.","PeriodicalId":501570,"journal":{"name":"arXiv - CS - Computational Geometry","volume":"46 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140201018","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Emily Fox, Amir Nayyeri, Jonathan James Perry, Benjamin Raichel
We define and investigate the Fréchet edit distance problem. Given two polygonal curves $\pi$ and $\sigma$ and a threshold value $\delta>0$, we seek the minimum number of edits to $\sigma$ such that the Fréchet distance between the edited $\sigma$ and $\pi$ is at most $\delta$. For the edit operations we consider three cases, namely, deletion of vertices, insertion of vertices, or both. For this basic problem we consider a number of variants. Specifically, we provide polynomial-time algorithms for both discrete and continuous Fréchet edit distance variants, as well as hardness results for weak Fréchet edit distance variants.
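As background for the distance notion being edited, the sketch below shows the textbook dynamic program for the discrete Fréchet distance, followed by a brute-force deletion-only edit search built on top of it. This is a reference-point sketch under illustrative names; it is not the paper's polynomial-time algorithm.

```python
# Standard O(n*m) dynamic program for the discrete Fréchet distance between
# two point sequences, plus an exponential brute-force "deletion edit" search
# for illustration only.
from math import dist, inf
from itertools import combinations

def discrete_frechet(P, Q):
    n, m = len(P), len(Q)
    dp = [[inf] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            d = dist(P[i], Q[j])
            if i == 0 and j == 0:
                dp[i][j] = d
            else:
                best_prev = min(dp[i - 1][j] if i > 0 else inf,
                                dp[i][j - 1] if j > 0 else inf,
                                dp[i - 1][j - 1] if i > 0 and j > 0 else inf)
                dp[i][j] = max(best_prev, d)
    return dp[n - 1][m - 1]

def frechet_delete_edit_bruteforce(pi, sigma, delta):
    """Minimum number of vertex deletions from sigma so that the discrete
    Fréchet distance to pi drops to at most delta (or None if impossible)."""
    for k in range(len(sigma) + 1):
        for keep in combinations(range(len(sigma)), len(sigma) - k):
            if keep and discrete_frechet(pi, [sigma[i] for i in keep]) <= delta:
                return k
    return None

print(frechet_delete_edit_bruteforce([(0, 0), (1, 0)], [(0, 0), (0, 5), (1, 0)], 0.5))  # 1
```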
{"title":"Fréchet Edit Distance","authors":"Emily Fox, Amir Nayyeri, Jonathan James Perry, Benjamin Raichel","doi":"arxiv-2403.12878","DOIUrl":"https://doi.org/arxiv-2403.12878","url":null,"abstract":"We define and investigate the Fr'{e}chet edit distance problem. Given two\u0000polygonal curves $pi$ and $sigma$ and a threshhold value $delta>0$, we seek\u0000the minimum number of edits to $sigma$ such that the Fr'{e}chet distance\u0000between the edited $sigma$ and $pi$ is at most $delta$. For the edit\u0000operations we consider three cases, namely, deletion of vertices, insertion of\u0000vertices, or both. For this basic problem we consider a number of variants.\u0000Specifically, we provide polynomial time algorithms for both discrete and\u0000continuous Fr'{e}chet edit distance variants, as well as hardness results for\u0000weak Fr'{e}chet edit distance variants.","PeriodicalId":501570,"journal":{"name":"arXiv - CS - Computational Geometry","volume":"25 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140165761","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Let $P$ be a set of $m$ points in ${\mathbb R}^2$, let $\Sigma$ be a set of $n$ semi-algebraic sets of constant complexity in ${\mathbb R}^2$, let $(S,+)$ be a semigroup, and let $w: P \rightarrow S$ be a weight function on the points of $P$. We describe a randomized algorithm for computing $w(P\cap\sigma)$ for every $\sigma\in\Sigma$ in overall expected time $O^*\bigl( m^{\frac{2s}{5s-4}}n^{\frac{5s-6}{5s-4}} + m^{2/3}n^{2/3} + m + n \bigr)$, where $s>0$ is a constant that bounds the maximum complexity of the regions of $\Sigma$, and where the $O^*(\cdot)$ notation hides subpolynomial factors. For $s\ge 3$, surprisingly, this bound is smaller than the best-known bound for answering $m$ such queries in an on-line manner. The latter takes $O^*(m^{\frac{s}{2s-1}}n^{\frac{2s-2}{2s-1}}+m+n)$ time.

Let $\Phi: \Sigma \times P \rightarrow \{0,1\}$ be the Boolean predicate (of constant complexity) such that $\Phi(\sigma,p) = 1$ if $p\in\sigma$ and $0$ otherwise, and let $\Sigma \mathop{\Phi} P = \{ (\sigma,p) \in \Sigma\times P \mid \Phi(\sigma,p)=1\}$. Our algorithm actually computes a partition ${\mathcal B}_\Phi$ of $\Sigma \mathop{\Phi} P$ into bipartite cliques (bicliques) of size (i.e., sum of the sizes of the vertex sets of its bicliques) $O^*\bigl( m^{\frac{2s}{5s-4}}n^{\frac{5s-6}{5s-4}} + m^{2/3}n^{2/3} + m + n \bigr)$. It is straightforward to compute $w(P\cap\sigma)$ for all $\sigma\in \Sigma$ from ${\mathcal B}_\Phi$. Similarly, if $\eta: \Sigma \rightarrow S$ is a weight function on the regions of $\Sigma$, then $\sum_{\sigma\in \Sigma:\, p \in \sigma} \eta(\sigma)$, for every point $p\in P$, can be computed from ${\mathcal B}_\Phi$ in a straightforward manner. A recent work of Chan et al. solves the online version of this dual point-enclosure problem within the same performance bound as our off-line solution. We also mention a few other applications of computing ${\mathcal B}_\Phi$.
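The last step, recovering the range sums from the biclique partition, is indeed straightforward. The hedged sketch below (illustrative names, with $(\mathbb{Z},+)$ standing in for the semigroup) aggregates $w(P\cap\sigma)$ for all regions in time linear in the total biclique size.

```python
# Given a biclique partition of the incidences {(sigma, p) : p in sigma},
# each biclique (regions, points) contributes the same point-weight sum to
# every one of its regions, so one pass over the bicliques suffices.
from collections import defaultdict

def region_weights_from_bicliques(bicliques, w):
    """bicliques: list of (regions, points) pairs covering every incidence exactly once.
    w: dict mapping each point to its weight in the example semigroup (Z, +).
    Returns a dict mapping each region sigma to w(P ∩ sigma)."""
    out = defaultdict(int)
    for regions, points in bicliques:
        block_sum = sum(w[p] for p in points)  # shared by all regions of this biclique
        for sigma in regions:
            out[sigma] += block_sum
    return dict(out)

# Tiny example with two bicliques over regions {A, B, C} and points {1, 2, 3}:
bicliques = [(["A", "B"], [1, 2]), (["B", "C"], [3])]
weights = {1: 5, 2: 7, 3: 2}
print(region_weights_from_bicliques(bicliques, weights))  # {'A': 12, 'B': 14, 'C': 2}
```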
{"title":"Semi-Algebraic Off-line Range Searching and Biclique Partitions in the Plane","authors":"Pankaj K. Agarwal, Esther Ezra, Micha Sharir","doi":"arxiv-2403.12276","DOIUrl":"https://doi.org/arxiv-2403.12276","url":null,"abstract":"Let $P$ be a set of $m$ points in ${mathbb R}^2$, let $Sigma$ be a set of\u0000$n$ semi-algebraic sets of constant complexity in ${mathbb R}^2$, let $(S,+)$\u0000be a semigroup, and let $w: P rightarrow S$ be a weight function on the points\u0000of $P$. We describe a randomized algorithm for computing $w(Pcapsigma)$ for\u0000every $sigmainSigma$ in overall expected time $O^*bigl(\u0000m^{frac{2s}{5s-4}}n^{frac{5s-6}{5s-4}} + m^{2/3}n^{2/3} + m + n bigr)$,\u0000where $s>0$ is a constant that bounds the maximum complexity of the regions of\u0000$Sigma$, and where the $O^*(cdot)$ notation hides subpolynomial factors. For\u0000$sge 3$, surprisingly, this bound is smaller than the best-known bound for\u0000answering $m$ such queries in an on-line manner. The latter takes\u0000$O^*(m^{frac{s}{2s-1}}n^{frac{2s-2}{2s-1}}+m+n)$ time. Let $Phi: Sigma times P rightarrow {0,1}$ be the Boolean predicate (of\u0000constant complexity) such that $Phi(sigma,p) = 1$ if $pinsigma$ and $0$\u0000otherwise, and let $Sigmamathop{Phi} P = { (sigma,p) in Sigmatimes P\u0000mid Phi(sigma,p)=1}$. Our algorithm actually computes a partition\u0000${mathcal B}_Phi$ of $Sigmamathop{Phi} P$ into bipartite cliques\u0000(bicliques) of size (i.e., sum of the sizes of the vertex sets of its\u0000bicliques) $O^*bigl( m^{frac{2s}{5s-4}}n^{frac{5s-6}{5s-4}} + m^{2/3}n^{2/3}\u0000+ m + n bigr)$. It is straightforward to compute $w(Pcapsigma)$ for all\u0000$sigmain Sigma$ from ${mathcal B}_Phi$. Similarly, if $eta: Sigma\u0000rightarrow S$ is a weight function on the regions of $Sigma$,\u0000$sum_{sigmain Sigma: p in sigma} eta(sigma)$, for every point $pin P$,\u0000can be computed from ${mathcal B}_Phi$ in a straightforward manner. A recent\u0000work of Chan et al. solves the online version of this dual point enclosure\u0000problem within the same performance bound as our off-line solution. We also\u0000mention a few other applications of computing ${mathcal B}_Phi$.","PeriodicalId":501570,"journal":{"name":"arXiv - CS - Computational Geometry","volume":"24 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140165877","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Rathish Das, Omrit Filtser, Matthew J. Katz, Joseph S. B. Mitchell
We propose precise notions of what it means to guard a domain "robustly", under a variety of models. While designing approximation algorithms for minimizing the number of (precise) point guards in a polygon is a notoriously challenging area of investigation, we show that imposing various degrees of robustness on the notion of visibility coverage leads to a more tractable (and realistic) problem for which we can provide approximation algorithms with constant-factor guarantees.
{"title":"Robustly Guarding Polygons","authors":"Rathish Das, Omrit Filtser, Matthew J. Katz, Joseph S. B. Mitchell","doi":"arxiv-2403.11861","DOIUrl":"https://doi.org/arxiv-2403.11861","url":null,"abstract":"We propose precise notions of what it means to guard a domain \"robustly\",\u0000under a variety of models. While approximation algorithms for minimizing the\u0000number of (precise) point guards in a polygon is a notoriously challenging area\u0000of investigation, we show that imposing various degrees of robustness on the\u0000notion of visibility coverage leads to a more tractable (and realistic) problem\u0000for which we can provide approximation algorithms with constant factor\u0000guarantees.","PeriodicalId":501570,"journal":{"name":"arXiv - CS - Computational Geometry","volume":"36 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140165875","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Polynomial partitioning techniques have recently led to improved geometric data structures for a variety of fundamental problems related to semialgebraic range searching and intersection searching in 3D and higher dimensions (e.g., see [Agarwal, Aronov, Ezra, and Zahl, SoCG 2019; Ezra and Sharir, SoCG 2021; Agarwal, Aronov, Ezra, Katz, and Sharir, SoCG 2022]). They have also led to improved algorithms for offline versions of semialgebraic range searching in 2D, via lens-cutting [Sharir and Zahl (2017)]. In this paper, we show that these techniques can yield new data structures for a number of other 2D problems, even for online queries:

1. Semialgebraic range stabbing. We present a data structure for $n$ semialgebraic ranges in 2D of constant description complexity with $O(n^{3/2+\varepsilon})$ preprocessing time and space, so that we can count the number of ranges containing a query point in $O(n^{1/4+\varepsilon})$ time, for an arbitrarily small constant $\varepsilon>0$.

2. Ray shooting amid algebraic arcs. We present a data structure for $n$ algebraic arcs in 2D of constant description complexity with $O(n^{3/2+\varepsilon})$ preprocessing time and space, so that we can find the first arc hit by a query (straight-line) ray in $O(n^{1/4+\varepsilon})$ time.

3. Intersection counting amid algebraic arcs. We present a data structure for $n$ algebraic arcs in 2D of constant description complexity with $O(n^{3/2+\varepsilon})$ preprocessing time and space, so that we can count the number of intersection points with a query algebraic arc of constant description complexity in $O(n^{1/2+\varepsilon})$ time. In particular, this implies an $O(n^{3/2+\varepsilon})$-time algorithm for counting intersections between two sets of $n$ algebraic arcs in 2D.
{"title":"Semialgebraic Range Stabbing, Ray Shooting, and Intersection Counting in the Plane","authors":"Timothy M. Chan, Pingan Cheng, Da Wei Zheng","doi":"arxiv-2403.12303","DOIUrl":"https://doi.org/arxiv-2403.12303","url":null,"abstract":"Polynomial partitioning techniques have recently led to improved geometric\u0000data structures for a variety of fundamental problems related to semialgebraic\u0000range searching and intersection searching in 3D and higher dimensions (e.g.,\u0000see [Agarwal, Aronov, Ezra, and Zahl, SoCG 2019; Ezra and Sharir, SoCG 2021;\u0000Agarwal, Aronov, Ezra, Katz, and Sharir, SoCG 2022]). They have also led to\u0000improved algorithms for offline versions of semialgebraic range searching in\u00002D, via lens-cutting [Sharir and Zahl (2017)]. In this paper, we show that\u0000these techniques can yield new data structures for a number of other 2D\u0000problems even for online queries: 1. Semialgebraic range stabbing. We present a data structure for $n$\u0000semialgebraic ranges in 2D of constant description complexity with\u0000$O(n^{3/2+varepsilon})$ preprocessing time and space, so that we can count the\u0000number of ranges containing a query point in $O(n^{1/4+varepsilon})$ time, for\u0000an arbitrarily small constant $varepsilon>0$. 2. Ray shooting amid algebraic arcs. We present a data structure for $n$\u0000algebraic arcs in 2D of constant description complexity with\u0000$O(n^{3/2+varepsilon})$ preprocessing time and space, so that we can find the\u0000first arc hit by a query (straight-line) ray in $O(n^{1/4+varepsilon})$ time. 3. Intersection counting amid algebraic arcs. We present a data structure for\u0000$n$ algebraic arcs in 2D of constant description complexity with\u0000$O(n^{3/2+varepsilon})$ preprocessing time and space, so that we can count the\u0000number of intersection points with a query algebraic arc of constant\u0000description complexity in $O(n^{1/2+varepsilon})$ time. In particular, this\u0000implies an $O(n^{3/2+varepsilon})$-time algorithm for counting intersections\u0000between two sets of $n$ algebraic arcs in 2D.","PeriodicalId":501570,"journal":{"name":"arXiv - CS - Computational Geometry","volume":"8 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140166128","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Nithin Parepally, Ainesh Chatterjee, Auguste Gezalyan, Hongyang Du, Sukrit Mangla, Kenny Wu, Sarah Hwang, David Mount
There are many structures, both classical and modern, involving convex polygonal geometries whose deeper understanding would be facilitated through interactive visualizations. The Ipe extensible drawing editor, developed by Otfried Cheong, is a widely used software system for generating geometric figures. One of its features is the capability to extend its functionality through programs called Ipelets. In this media submission, we showcase a collection of new Ipelets that construct a variety of geometric objects based on polygonal geometries. These include Macbeath regions, metric balls in the forward and reverse Funk distance, metric balls in the Hilbert metric, polar bodies, the minimum enclosing ball of a point set, and minimum spanning trees in both the Funk and Hilbert metrics. We also include a number of utilities on convex polygons, including union, intersection, subtraction, and Minkowski sum (previously implemented as a CGAL Ipelet). All of our Ipelets are programmed in Lua and are freely available.
{"title":"Ipelets for the Convex Polygonal Geometry","authors":"Nithin Parepally, Ainesh Chatterjee, Auguste Gezalyan, Hongyang Du, Sukrit Mangla, Kenny Wu, Sarah Hwang, David Mount","doi":"arxiv-2403.10033","DOIUrl":"https://doi.org/arxiv-2403.10033","url":null,"abstract":"There are many structures, both classical and modern, involving convex\u0000polygonal geometries whose deeper understanding would be facilitated through\u0000interactive visualizations. The Ipe extensible drawing editor, developed by\u0000Otfried Cheong, is a widely used software system for generating geometric\u0000figures. One of its features is the capability to extend its functionality\u0000through programs called Ipelets. In this media submission, we showcase a\u0000collection of new Ipelets that construct a variety of geometric objects based\u0000on polygonal geometries. These include Macbeath regions, metric balls in the\u0000forward and reverse Funk distance, metric balls in the Hilbert metric, polar\u0000bodies, the minimum enclosing ball of a point set, and minimum spanning trees\u0000in both the Funk and Hilbert metrics. We also include a number of utilities on\u0000convex polygons, including union, intersection, subtraction, and Minkowski sum\u0000(previously implemented as a CGAL Ipelet). All of our Ipelets are programmed in\u0000Lua and are freely available.","PeriodicalId":501570,"journal":{"name":"arXiv - CS - Computational Geometry","volume":"23 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140154912","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The problem of packing unequal circles into a circular container stands as a classic and challenging optimization problem in computational geometry. This study introduces a suite of innovative and efficient methods to tackle this problem. Firstly, we present a novel layout-graph transformation method that represents configurations as graphs, together with an inexact hash method facilitating fast comparison of configurations for isomorphism or similarity. Leveraging these advancements, we propose an Iterative Solution-Hashing Search algorithm adept at circumventing redundant exploration through efficient configuration recording. Additionally, we introduce several enhancements to refine the optimization and search processes, including an adaptive adjacency maintenance method, an efficient vacancy detection technique, and a Voronoi-based locating method. Through comprehensive computational experiments across various benchmark instances, our algorithm demonstrates superior performance over existing state-of-the-art methods, showcasing remarkable applicability and versatility. Notably, our algorithm surpasses the best-known results for 56 out of 179 benchmark instances while achieving parity with the remaining instances.
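As a loose illustration of what a layout graph and an inexact configuration hash might look like (an assumption-laden sketch, not the authors' method), one can connect nearly tangent circles and hash the multiset of local radius signatures; the result is invariant under relabeling and translation of the circles, while equal hashes only suggest, rather than certify, similarity.

```python
# Hedged sketch of an inexact, order-independent hash of a circle layout.
# Circles are (x, y, r) triples; two circles are adjacent if nearly tangent.
import hashlib
from math import hypot

def layout_graph(circles, tol=1e-3):
    n = len(circles)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1, r1), (x2, y2, r2) = circles[i], circles[j]
            if abs(hypot(x1 - x2, y1 - y2) - (r1 + r2)) <= tol:  # near-tangency
                adj[i].append(j)
                adj[j].append(i)
    return adj

def inexact_config_hash(circles, tol=1e-3, digits=2):
    adj = layout_graph(circles, tol)
    # Local signature of each circle: its rounded radius plus the sorted rounded
    # radii of its tangent neighbours (ignores absolute positions and labels).
    sigs = sorted(
        (round(circles[i][2], digits),
         tuple(sorted(round(circles[j][2], digits) for j in adj[i])))
        for i in range(len(circles))
    )
    return hashlib.sha1(repr(sigs).encode()).hexdigest()

# Two layouts that differ only by relabeling and translating the circles hash identically.
layout_a = [(0.0, 0.0, 1.0), (2.0, 0.0, 1.0), (4.0, 0.0, 1.0)]
layout_b = [(5.0, 1.0, 1.0), (3.0, 1.0, 1.0), (1.0, 1.0, 1.0)]
print(inexact_config_hash(layout_a) == inexact_config_hash(layout_b))  # True
```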
{"title":"Solution-Hashing Search Based on Layout-Graph Transformation for Unequal Circle Packing","authors":"Jianrong Zhou, Jiyao He, Kun He","doi":"arxiv-2403.06211","DOIUrl":"https://doi.org/arxiv-2403.06211","url":null,"abstract":"The problem of packing unequal circles into a circular container stands as a\u0000classic and challenging optimization problem in computational geometry. This\u0000study introduces a suite of innovative and efficient methods to tackle this\u0000problem. Firstly, we present a novel layout-graph transformation method that\u0000represents configurations as graphs, together with an inexact hash method\u0000facilitating fast comparison of configurations for isomorphism or similarity.\u0000Leveraging these advancements, we propose an Iterative Solution-Hashing Search\u0000algorithm adept at circumventing redundant exploration through efficient\u0000configuration recording. Additionally, we introduce several enhancements to\u0000refine the optimization and search processes, including an adaptive adjacency\u0000maintenance method, an efficient vacancy detection technique, and a\u0000Voronoi-based locating method. Through comprehensive computational experiments\u0000across various benchmark instances, our algorithm demonstrates superior\u0000performance over existing state-of-the-art methods, showcasing remarkable\u0000applicability and versatility. Notably, our algorithm surpasses the best-known\u0000results for 56 out of 179 benchmark instances while achieving parity with the\u0000remaining instances.","PeriodicalId":501570,"journal":{"name":"arXiv - CS - Computational Geometry","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140106160","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Let $d$ be a (well-behaved) shortest-path metric defined on a path-connected subset of $\mathbb{R}^2$ and let $\mathcal{D}=\{D_1,\ldots,D_n\}$ be a set of geodesic disks with respect to the metric $d$. We prove that $\mathcal{G}^{\times}(\mathcal{D})$, the intersection graph of the disks in $\mathcal{D}$, has a clique-based separator consisting of $O(n^{3/4+\varepsilon})$ cliques. This significantly extends the class of objects whose intersection graphs have small clique-based separators.

Our clique-based separator yields an algorithm for $q$-COLORING that runs in time $2^{O(n^{3/4+\varepsilon})}$, assuming the boundaries of the disks $D_i$ can be computed in polynomial time. We also use our clique-based separator to obtain a simple, efficient, and almost exact distance oracle for intersection graphs of geodesic disks. Our distance oracle uses $O(n^{7/4+\varepsilon})$ storage and can report the hop distance between any two nodes in $\mathcal{G}^{\times}(\mathcal{D})$ in $O(n^{3/4+\varepsilon})$ time, up to an additive error of one. So far, distance oracles with an additive error of one that use subquadratic storage and sublinear query time were not known for such general graph classes.
{"title":"A Clique-Based Separator for Intersection Graphs of Geodesic Disks in $mathbb{R}^2$","authors":"Boris Aronov, Mark de Berg, Leonidas Theocharous","doi":"arxiv-2403.04905","DOIUrl":"https://doi.org/arxiv-2403.04905","url":null,"abstract":"Let $d$ be a (well-behaved) shortest-path metric defined on a path-connected\u0000subset of $mathbb{R}^2$ and let $mathcal{D}={D_1,ldots,D_n}$ be a set of\u0000geodesic disks with respect to the metric $d$. We prove that\u0000$mathcal{G}^{times}(mathcal{D})$, the intersection graph of the disks in\u0000$mathcal{D}$, has a clique-based separator consisting of\u0000$O(n^{3/4+varepsilon})$ cliques. This significantly extends the class of\u0000objects whose intersection graphs have small clique-based separators. Our clique-based separator yields an algorithm for $q$-COLORING that runs in\u0000time $2^{O(n^{3/4+varepsilon})}$, assuming the boundaries of the disks $D_i$\u0000can be computed in polynomial time. We also use our clique-based separator to\u0000obtain a simple, efficient, and almost exact distance oracle for intersection\u0000graphs of geodesic disks. Our distance oracle uses $O(n^{7/4+varepsilon})$\u0000storage and can report the hop distance between any two nodes in\u0000$mathcal{G}^{times}(mathcal{D})$ in $O(n^{3/4+varepsilon})$ time, up to an\u0000additive error of one. So far, distance oracles with an additive error of one\u0000that use subquadratic storage and sublinear query time were not known for such\u0000general graph classes.","PeriodicalId":501570,"journal":{"name":"arXiv - CS - Computational Geometry","volume":"8 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140097727","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Karl Bringmann, Frank Staals, Karol Węgrzycki, Geert van Wordragen
The Earth Mover's Distance is a popular similarity measure in several branches of computer science. It measures the minimum total edge length of a perfect matching between two point sets. The Earth Mover's Distance under Translation ($\mathrm{EMDuT}$) is a translation-invariant version thereof. It minimizes the Earth Mover's Distance over all translations of one point set.

For $\mathrm{EMDuT}$ in $\mathbb{R}^1$, we present an $\widetilde{\mathcal{O}}(n^2)$-time algorithm. We also show that this algorithm is nearly optimal by presenting a matching conditional lower bound based on the Orthogonal Vectors Hypothesis. For $\mathrm{EMDuT}$ in $\mathbb{R}^d$, we present an $\widetilde{\mathcal{O}}(n^{2d+2})$-time algorithm for the $L_1$ and $L_\infty$ metric. We show that this dependence on $d$ is asymptotically tight, as an $n^{o(d)}$-time algorithm for $L_1$ or $L_\infty$ would contradict the Exponential Time Hypothesis (ETH). Prior to our work, only approximation algorithms were known for these problems.
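For intuition about the problem statement, here is a hedged toy case under an extra assumption: two point sets in $\mathbb{R}^1$ of equal size. There the sorted matching is optimal for every translation, so the problem reduces to a median computation. This sketch only illustrates the definition; it does not reflect the variants for which the paper proves its algorithms and lower bounds.

```python
# Toy special case: EMD under translation in R^1 for two *equal-size* point sets.
# For any translation t of B, the optimal 1D perfect matching pairs the sorted
# sequences, so the cost is sum_i |a_(i) - b_(i) - t|, minimized at the median
# of the sorted coordinate differences.
from statistics import median

def emd_under_translation_1d(A, B):
    assert len(A) == len(B), "this sketch assumes equal-size point sets"
    diffs = sorted(a - b for a, b in zip(sorted(A), sorted(B)))
    t = median(diffs)                      # best translation of B
    cost = sum(abs(d - t) for d in diffs)
    return cost, t

# Example: translating B by t = 10 aligns it perfectly with A, so the EMDuT is 0.
print(emd_under_translation_1d([1, 2, 5], [-9, -8, -5]))  # (0, 10)
```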
{"title":"Fine-Grained Complexity of Earth Mover's Distance under Translation","authors":"Karl Bringmann, Frank Staals, Karol Węgrzycki, Geert van Wordragen","doi":"arxiv-2403.04356","DOIUrl":"https://doi.org/arxiv-2403.04356","url":null,"abstract":"The Earth Mover's Distance is a popular similarity measure in several\u0000branches of computer science. It measures the minimum total edge length of a\u0000perfect matching between two point sets. The Earth Mover's Distance under\u0000Translation ($mathrm{EMDuT}$) is a translation-invariant version thereof. It\u0000minimizes the Earth Mover's Distance over all translations of one point set. For $mathrm{EMDuT}$ in $mathbb{R}^1$, we present an\u0000$widetilde{mathcal{O}}(n^2)$-time algorithm. We also show that this algorithm\u0000is nearly optimal by presenting a matching conditional lower bound based on the\u0000Orthogonal Vectors Hypothesis. For $mathrm{EMDuT}$ in $mathbb{R}^d$, we\u0000present an $widetilde{mathcal{O}}(n^{2d+2})$-time algorithm for the $L_1$ and\u0000$L_infty$ metric. We show that this dependence on $d$ is asymptotically tight,\u0000as an $n^{o(d)}$-time algorithm for $L_1$ or $L_infty$ would contradict the\u0000Exponential Time Hypothesis (ETH). Prior to our work, only approximation\u0000algorithms were known for these problems.","PeriodicalId":501570,"journal":{"name":"arXiv - CS - Computational Geometry","volume":"53 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140072094","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Let $\mathcal{P}$ be a simple polygon with $m$ vertices and let $P$ be a set of $n$ points inside $\mathcal{P}$. We prove that there exists, for any $\varepsilon>0$, a set $\mathcal{C} \subset P$ of size $O(1/\varepsilon^2)$ such that the following holds: for any query point $q$ inside the polygon $\mathcal{P}$, the geodesic distance from $q$ to its furthest neighbor in $\mathcal{C}$ is at least $1-\varepsilon$ times the geodesic distance to its furthest neighbor in $P$. Thus the set $\mathcal{C}$ can be used for answering $\varepsilon$-approximate furthest-neighbor queries with a data structure whose storage requirement is independent of the size of $P$. The coreset can be constructed in $O\left(\frac{1}{\varepsilon} \left( n\log(1/\varepsilon) + (n+m)\log(n+m)\right) \right)$ time.
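For comparison, the sketch below shows the analogous statement in the unconstrained Euclidean plane, using standard directional sampling rather than the paper's geodesic construction: keeping the point of $P$ that is extreme in each of $O(1/\sqrt{\varepsilon})$ evenly spaced directions already yields a $(1-\varepsilon)$-approximate furthest-neighbor coreset. All names are illustrative.

```python
# Hedged Euclidean illustration (not the paper's geodesic construction):
# for the furthest point p* of a query q, some sampled direction u makes an
# angle at most pi/k with q->p*, and the point extreme in direction u is at
# distance >= cos(pi/k) * |q - p*| >= (1 - eps) * |q - p*| from q.
from math import cos, sin, pi, ceil, dist
import random

def euclidean_furthest_neighbor_coreset(P, eps):
    k = ceil(pi / (2 * eps) ** 0.5)        # ensures cos(pi/k) >= 1 - eps
    C = set()
    for i in range(k):
        ux, uy = cos(2 * pi * i / k), sin(2 * pi * i / k)
        C.add(max(P, key=lambda p: p[0] * ux + p[1] * uy))  # extreme point in direction (ux, uy)
    return list(C)

def approx_furthest_distance(C, q):
    return max(dist(q, c) for c in C)

# Usage: the coreset size depends only on eps, not on |P|.
P = [(random.random(), random.random()) for _ in range(10000)]
C = euclidean_furthest_neighbor_coreset(P, 0.1)
q = (0.5, 0.5)
exact = max(dist(q, p) for p in P)
print(len(C), approx_furthest_distance(C, q) >= (1 - 0.1) * exact)  # small coreset, True
```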
{"title":"A Coreset for Approximate Furthest-Neighbor Queries in a Simple Polygon","authors":"Mark de Berg, Leonidas Theocharous","doi":"arxiv-2403.04513","DOIUrl":"https://doi.org/arxiv-2403.04513","url":null,"abstract":"Let $mathcal{P}$ be a simple polygon with $m$ vertices and let $P$ be a set\u0000of $n$ points inside $mathcal{P}$. We prove that there exists, for any\u0000$varepsilon>0$, a set $mathcal{C} subset P$ of size $O(1/varepsilon^2)$\u0000such that the following holds: for any query point $q$ inside the polygon\u0000$mathcal{P}$, the geodesic distance from $q$ to its furthest neighbor in\u0000$mathcal{C}$ is at least $1-varepsilon$ times the geodesic distance to its\u0000further neighbor in $P$. Thus the set $mathcal{C}$ can be used for answering\u0000$varepsilon$-approximate furthest-neighbor queries with a data structure whose\u0000storage requirement is independent of the size of $P$. The coreset can be\u0000constructed in $Oleft(frac{1}{varepsilon} left( nlog(1/varepsilon) +\u0000(n+m)log(n+m)right) right)$ time.","PeriodicalId":501570,"journal":{"name":"arXiv - CS - Computational Geometry","volume":"34 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140072371","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}