Pub Date: 2018-03-01. Epub Date: 2018-03-29. DOI: 10.1142/s0218195918500012
LAGUERRE-INTERSECTION METHOD FOR IMPLICIT SOLVATION
Michelle Hatch Hummel, Bihua Yu, Carlos Simmerling, Evangelos A Coutsias
Explicit solvent molecular dynamics simulations of a macromolecule are slow, as the number of solvent atoms considered typically increases by an order of magnitude. Implicit methods introduce surface-dependent corrections to the force field, gaining speed at the expense of accuracy. Properties such as molecular interface surfaces, volumes and cavities are captured by Laguerre tessellations of macromolecules. However, Laguerre cells of exterior atoms tend to be overly large or unbounded. Our method, the inclusion-exclusion based Laguerre-Intersection method, caps cells in a physically accurate manner by considering the intersection of the space-filling diagram with the Laguerre tessellation. We optimize an adjustable parameter, the weight, to ensure the areas and volumes of capped cells exposed to solvent are as close as possible, on average, to those computed from equilibrated explicit solvent simulations. The contact planes are radical planes, meaning that as the solvent weight is varied, interior cells remain constant. We test the consistency of our model using a high-quality trajectory of HIV-protease, a dimer with flexible loops and open-close transitions. We also compare our results with an interval-arithmetic, Gauss-Bonnet based method. The optimal solvent parameters converge quickly, and we use them to illustrate the increased fidelity of the Laguerre-Intersection method, relative to two recently proposed methods, when compared against the explicit model.
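For readers unfamiliar with the radical-plane construction mentioned above, the following minimal Python sketch (not the authors' code) computes the radical plane of two weighted spheres, i.e., the plane on which the power distances $|x-c_1|^2-r_1^2$ and $|x-c_2|^2-r_2^2$ agree; the centers and radii below are purely illustrative.

```python
import numpy as np

def radical_plane(c1, r1, c2, r2):
    """Return (n, d) with n . x = d describing the radical plane of two spheres.

    The radical plane is the locus where the power distances
    |x - c1|^2 - r1^2 and |x - c2|^2 - r2^2 coincide.
    """
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    n = 2.0 * (c2 - c1)                                   # plane normal
    d = np.dot(c2, c2) - np.dot(c1, c1) - r2 ** 2 + r1 ** 2
    return n, d

# Illustrative example: one solute atom against one solvent sphere of weight 1.4.
n, d = radical_plane([0.0, 0.0, 0.0], 1.7, [3.0, 0.0, 0.0], 1.4)
print(n, d)   # a point x lies on the first sphere's side of the cap if n . x < d
```

Because such a plane depends only on centers and squared radii, varying the solvent weight moves only planes that involve a solvent sphere, which is consistent with the abstract's remark that interior cells remain constant.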
{"title":"LAGUERRE-INTERSECTION METHOD FOR IMPLICIT SOLVATION.","authors":"Michelle Hatch Hummel, Bihua Yu, Carlos Simmerling, Evangelos A Coutsias","doi":"10.1142/s0218195918500012","DOIUrl":"https://doi.org/10.1142/s0218195918500012","url":null,"abstract":"<p><p>Explicit solvent molecular dynamics simulations of a macromolecule are slow as the number of solvent atoms considered typically increases by order of magnitude. Implicit methods introduce surface-dependent corrections to the force field, gaining speed at the expense of accuracy. Properties such as molecular interface surfaces, volumes and cavities are captured by Laguerre tessellations of macromolecules. However, Laguerre cells of exterior atoms tend to be overly large or unbounded. Our method, the inclusion-exclusion based Laguerre-Intersection method, caps cells in a physically accurate manner by considering the intersection of the space-filling diagram with the Laguerre tessellation. We optimize an adjustable parameter, the weight, to ensure the areas and volumes of capped cells exposed to solvent are as close as possible, on average, to those computed from equilibrated explicit solvent simulations. The contact planes are radical planes, meaning that as the solvent weight is varied, interior cells remain constant. We test the consistency of our model using a high-quality trajectory of HIV-protease, a dimer with flexible loops and open-close transitions. We also compare our results with interval-arithmetic Gauss-Bonnet based method. Optimal solvent parameters quickly converge, which we use to illustrate the increased fidelity of the Laguerre-Intersection method over two recently proposed methods as compared to the explicit model.</p>","PeriodicalId":54969,"journal":{"name":"International Journal of Computational Geometry & Applications","volume":"28 1","pages":"1-38"},"PeriodicalIF":0.0,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1142/s0218195918500012","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"37202633","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-03-01. DOI: 10.4230/LIPIcs.SoCG.2018.56
On Optimal Polyline Simplification using the Hausdorff and Fréchet Distance
M. V. Kreveld, M. Löffler, Lionov Wiratma
We revisit the classical polygonal line simplification problem and study it using the Hausdorff distance and Fréchet distance. Interestingly, no previous authors studied line simplification under these measures in its pure form, namely: for a given $\varepsilon > 0$, choose a minimum-size subsequence of the vertices of the input such that the Hausdorff or Fréchet distance between the input and output polylines is at most $\varepsilon$. We analyze how the well-known Douglas-Peucker and Imai-Iri simplification algorithms perform compared to the optimum possible, also in the situation where the algorithms are given a considerably larger error threshold than $\varepsilon$. Furthermore, we show that computing an optimal simplification using the undirected Hausdorff distance is NP-hard. The same holds when using the directed Hausdorff distance from the input to the output polyline, whereas the reverse can be computed in polynomial time. Finally, to compute the optimal simplification from a polygonal line consisting of $n$ vertices under the Fréchet distance, we give an $O(kn^5)$ time algorithm that requires $O(kn^2)$ space, where $k$ is the output complexity of the simplification.
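As context for the comparison above, here is a minimal sketch of the classical Douglas-Peucker heuristic (the Imai-Iri algorithm and the optimal algorithms analysed in the paper are not shown); the sample polyline and threshold are made up for illustration.

```python
import numpy as np

def point_segment_distance(p, a, b):
    """Euclidean distance from point p to segment ab."""
    p, a, b = map(np.asarray, (p, a, b))
    ab = b - a
    denom = float(np.dot(ab, ab))
    t = 0.0 if denom == 0 else float(np.clip(np.dot(p - a, ab) / denom, 0.0, 1.0))
    return float(np.linalg.norm(p - (a + t * ab)))

def douglas_peucker(points, eps):
    """Keep the endpoints and recursively split at the vertex farthest from
    the chord whenever that distance exceeds eps."""
    if len(points) < 3:
        return list(points)
    dists = [point_segment_distance(q, points[0], points[-1]) for q in points[1:-1]]
    i = int(np.argmax(dists)) + 1
    if dists[i - 1] <= eps:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:i + 1], eps)
    right = douglas_peucker(points[i:], eps)
    return left[:-1] + right          # avoid duplicating the split vertex

polyline = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(polyline, 1.0))
```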
{"title":"On Optimal Polyline Simplification using the Hausdorff and Fréchet Distance","authors":"M. V. Kreveld, M. Löffler, Lionov Wiratma","doi":"10.4230/LIPIcs.SoCG.2018.56","DOIUrl":"https://doi.org/10.4230/LIPIcs.SoCG.2018.56","url":null,"abstract":"We revisit the classical polygonal line simplification problem and study it using the Hausdorff distance and Fr'echet distance. Interestingly, no previous authors studied line simplification under these measures in its pure form, namely: for a given $varepsilon$ > 0, choose a minimum size subsequence of the vertices of the input such that the Hausdorff or Fr'echet distance between the input and output polylines is at most $varepsilon$. We analyze how the well-known Douglas-Peucker and Imai-Iri simplification algorithms perform compared to the optimum possible, also in the situation where the algorithms are given a considerably larger error threshold than $varepsilon$. Furthermore, we show that computing an optimal simplification using the undirected Hausdorff distance is NP-hard. The same holds when using the directed Hausdorff distance from the input to the output polyline, whereas the reverse can be computed in polynomial time. Finally, to compute the optimal simplification from a polygonal line consisting of $n$ vertices under the Fr'echet distance, we give an $O(kn^5)$ time algorithm that requires $O(kn^2)$ space, where $k$ is the output complexity of the simplification.","PeriodicalId":54969,"journal":{"name":"International Journal of Computational Geometry & Applications","volume":"14 1","pages":"1-25"},"PeriodicalIF":0.0,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81995657","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-02-28. DOI: 10.4230/LIPIcs.SoCG.2018.26
The density of expected persistence diagrams and its kernel based estimation
F. Chazal, Vincent Divol
Persistence diagrams play a fundamental role in Topological Data Analysis, where they are used as topological descriptors of filtrations built on top of data. They consist of discrete multisets of points in the plane $\mathbb{R}^2$ that can equivalently be seen as discrete measures in $\mathbb{R}^2$. When the data come as a random point cloud, these discrete measures become random measures whose expectation is studied in this paper. First, we show that for a wide class of filtrations, including the Čech and Rips-Vietoris filtrations, the expected persistence diagram, which is a deterministic measure on $\mathbb{R}^2$, has a density with respect to the Lebesgue measure. Second, building on this result, we show that the persistence surface recently introduced in [Adams et al., Persistence images: a stable vector representation of persistent homology] can be seen as a kernel estimator of this density. We propose a cross-validation scheme for selecting an optimal bandwidth, which is proven to be a consistent procedure for estimating the density.
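The kernel-estimator viewpoint can be illustrated with a short sketch. The code below is a toy under stated assumptions, not the authors' implementation: it averages Gaussian kernels centred at the points of several diagrams to produce a persistence-surface-style estimate, and omits the weight functions (e.g. vanishing at the diagonal) used in actual persistence images.

```python
import numpy as np

def persistence_surface(diagrams, grid_x, grid_y, bandwidth=0.1):
    """Average Gaussian kernels centred at the (birth, death) points of a
    list of diagrams; each diagram is an (m, 2) array."""
    xx, yy = np.meshgrid(grid_x, grid_y)
    surface = np.zeros_like(xx, dtype=float)
    for dgm in diagrams:
        for b, d in dgm:
            surface += np.exp(-((xx - b) ** 2 + (yy - d) ** 2) / (2 * bandwidth ** 2))
    # Normalise so the result averages one kernel density per diagram.
    surface /= len(diagrams) * 2 * np.pi * bandwidth ** 2
    return surface

# Two toy diagrams with a single (birth, death) feature each.
dgms = [np.array([[0.10, 0.80]]), np.array([[0.15, 0.75]])]
grid = np.linspace(0.0, 1.0, 50)
S = persistence_surface(dgms, grid, grid, bandwidth=0.05)
print(S.shape)  # (50, 50)
```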
{"title":"The density of expected persistence diagrams and its kernel based estimation","authors":"F. Chazal, Vincent Divol","doi":"10.4230/LIPIcs.SoCG.2018.26","DOIUrl":"https://doi.org/10.4230/LIPIcs.SoCG.2018.26","url":null,"abstract":"Persistence diagrams play a fundamental role in Topological Data Analysis where they are used as topological descriptors of filtrations built on top of data. They consist in discrete multisets of points in the plane $mathbb{R}^2$ that can equivalently be seen as discrete measures in $mathbb{R}^2$. When the data come as a random point cloud, these discrete measures become random measures whose expectation is studied in this paper. First, we show that for a wide class of filtrations, including the v{C}ech and Rips-Vietoris filtrations, the expected persistence diagram, that is a deterministic measure on $mathbb{R}^2$ , has a density with respect to the Lebesgue measure. Second, building on the previous result we show that the persistence surface recently introduced in [Adams & al., Persistence images: a stable vector representation of persistent homology] can be seen as a kernel estimator of this density. We propose a cross-validation scheme for selecting an optimal bandwidth, which is proven to be a consistent procedure to estimate the density.","PeriodicalId":54969,"journal":{"name":"International Journal of Computational Geometry & Applications","volume":"42 1","pages":"127-153"},"PeriodicalIF":0.0,"publicationDate":"2018-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79071467","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-02-27. DOI: 10.4230/LIPIcs.ESA.2016.7
Scalable Exact Visualization of Isocontours in Road Networks via Minimum-Link Paths
M. Baum, Thomas Bläsius, Andreas Gemsa, Ignaz Rutter, Franziska Wegner
Isocontours in road networks represent the area that is reachable from a source within a given resource limit. We study the problem of computing accurate isocontours in realistic, large-scale networks. We propose isocontours represented by polygons with a minimum number of segments that separate the reachable and unreachable components of the network. Since the resulting problem is not known to be solvable in polynomial time, we introduce several heuristics that run in (almost) linear time and are simple enough to be implemented in practice. A key ingredient is a new practical linear-time algorithm for minimum-link paths in simple polygons. Experiments in a challenging realistic setting show excellent performance of our algorithms in practice, computing near-optimal solutions in a few milliseconds on average, even for long ranges.
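The first ingredient, computing the part of the network reachable within a resource limit, is standard; a minimal sketch using a Dijkstra search pruned at the limit follows. The minimum-link separating polygons, which are the paper's contribution, are not sketched, and the adjacency-list format and toy graph are assumptions for illustration.

```python
import heapq

def reachable_within(graph, source, limit):
    """Vertices reachable from `source` with total cost at most `limit`.

    `graph` maps a vertex to an iterable of (neighbour, cost) pairs.
    Returns a dict vertex -> cost; its keys form the reachable component.
    """
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph.get(u, ()):
            nd = d + w
            if nd <= limit and nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

road = {0: [(1, 2.0), (2, 5.0)], 1: [(3, 2.5)], 2: [(3, 1.0)], 3: []}
print(sorted(reachable_within(road, 0, 4.5)))  # [0, 1, 3]
```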
{"title":"Scalable Exact Visualization of Isocontours in Road Networks via Minimum-Link Paths","authors":"M. Baum, Thomas Bläsius, Andreas Gemsa, Ignaz Rutter, Franziska Wegner","doi":"10.4230/LIPIcs.ESA.2016.7","DOIUrl":"https://doi.org/10.4230/LIPIcs.ESA.2016.7","url":null,"abstract":"Isocontours in road networks represent the area that is reachable from a source within a given resource limit. We study the problem of computing accurate isocontours in realistic, large-scale networks. We propose isocontours represented by polygons with minimum number of segments that separate reachable and unreachable components of the network. Since the resulting problem is not known to be solvable in polynomial time, we introduce several heuristics that run in (almost) linear time and are simple enough to be implemented in practice. A key ingredient is a new practical linear-time algorithm for minimum-link paths in simple polygons. Experiments in a challenging realistic setting show excellent performance of our algorithms in practice, computing near-optimal solutions in a few milliseconds on average, even for long ranges.","PeriodicalId":54969,"journal":{"name":"International Journal of Computational Geometry & Applications","volume":"10 1","pages":"27-73"},"PeriodicalIF":0.0,"publicationDate":"2018-02-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72814582","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-02-14. DOI: 10.20382/jocg.v10i1a12
Median Shapes
Yunfeng Hu, M. Hudelson, B. Krishnamoorthy, Altansuren Tumurbaatar, K. Vixie
We introduce and begin to explore the mean and median of finite sets of shapes represented as integral currents. The median can be computed efficiently in practice, and we focus most of our theoretical and computational attention on medians. We consider questions on the existence and regularity of medians. While the median might not exist in all cases, we show that a mass-regularized median is guaranteed to exist. When the input shapes are modeled by integral currents with shared boundaries in codimension $1$, we show that the median is guaranteed to exist and is contained in the envelope of the input currents. On the other hand, we show that medians can be wild in this setting, and smooth inputs can generate non-smooth medians. For higher codimensions, we show that books are minimizing for a finite set of $1$-currents in $\mathbb{R}^3$ with shared boundaries. As part of this proof, we present a new result in graph theory—that cozy graphs are comfortable—which should be of independent interest. Further, we show that regular points on the median have book-like tangent cones in this case. From the point of view of computation, we study the median shape in the setting of a finite simplicial complex. When the input shapes are represented by chains of the simplicial complex, we show that the problem of finding the median shape can be formulated as an integer linear program. In practice, this optimization problem can be solved as a linear program, thus allowing one to compute median shapes efficiently. We provide open-source code implementing our methods, which can also be used by anyone to experiment with ideas of their own. The software is available at https://github.com/tbtraltaa/medianshape.
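To give a flavour of the computational formulation, the toy sketch below poses a heavily simplified "median chain" problem as a linear program with scipy: the input shapes are 0/1 vectors over the cells of a fixed complex and the objective is the summed $\ell_1$ deviation. This is only a cartoon under those assumptions; the paper's actual integer linear program involves boundary operators and flat-norm-style mass terms.

```python
import numpy as np
from scipy.optimize import linprog

def toy_median_chain(input_chains):
    """Toy LP: find t in [0,1]^n minimising sum_i ||t - t_i||_1 for 0/1
    input vectors t_i (a cartoon of a median-shape LP, not the paper's)."""
    T = np.asarray(input_chains, dtype=float)        # shape (m, n)
    m, n = T.shape
    # Variables: [t (n entries), s_{i,j} (m*n entries)] with s_{i,j} >= |t_j - T_{i,j}|.
    c = np.concatenate([np.zeros(n), np.ones(m * n)])
    A_ub, b_ub = [], []
    for i in range(m):
        for j in range(n):
            row = np.zeros(n + m * n)
            row[j], row[n + i * n + j] = 1.0, -1.0    #  t_j - s_ij <=  T_ij
            A_ub.append(row.copy()); b_ub.append(T[i, j])
            row[j] = -1.0                             # -t_j - s_ij <= -T_ij
            A_ub.append(row); b_ub.append(-T[i, j])
    bounds = [(0, 1)] * n + [(0, None)] * (m * n)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds)
    return res.x[:n]

# Three toy "chains" over four cells; the column-wise median is recovered.
print(np.round(toy_median_chain([[1, 1, 0, 0], [1, 0, 1, 0], [1, 1, 1, 0]]), 2))
```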
{"title":"Median Shapes","authors":"Yunfeng Hu, M. Hudelson, B. Krishnamoorthy, Altansuren Tumurbaatar, K. Vixie","doi":"10.20382/jocg.v10i1a12","DOIUrl":"https://doi.org/10.20382/jocg.v10i1a12","url":null,"abstract":"We introduce and begin to explore the mean and median of finite sets of shapes represented as integral currents. The median can be computed efficiently in practice, and we focus most of our theoretical and computational attention on medians. We consider questions on the existence and regularity of medians. While the median might not exist in all cases, we show that a mass-regularized median is guaranteed to exist. When the input shapes are modeled by integral currents with shared boundaries in codimension $1$, we show that the median is guaranteed to exist, and is contained in the emph{envelope} of the input currents. On the other hand, we show that medians can be emph{wild} in this setting, and smooth inputs can generate non-smooth medians. For higher codimensions, we show that emph{books} are minimizing for a finite set of $1$-currents in $Bbb{R}^3$ with shared boundaries. As part of this proof, we present a new result in graph theory---that emph{cozy} graphs are emph{comfortable}---which should be of independent interest. Further, we show that regular points on the median have book-like tangent cones in this case. From the point of view of computation, we study the median shape in the settings of a finite simplicial complex. When the input shapes are represented by chains of the simplicial complex, we show that the problem of finding the median shape can be formulated as an integer linear program. This optimization problem can be solved as a linear program in practice, thus allowing one to compute median shapes efficiently. We provide open source code implementing our methods, which could also be used by anyone to experiment with ideas of their own. The software could be accessed at href{https://github.com/tbtraltaa/medianshape}{https://github.com/tbtraltaa/medianshape}.","PeriodicalId":54969,"journal":{"name":"International Journal of Computational Geometry & Applications","volume":"4 1","pages":"322-388"},"PeriodicalIF":0.0,"publicationDate":"2018-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73605209","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-01-08. DOI: 10.4230/LIPIcs.SoCG.2018.42
Optimal Morphs of Planar Orthogonal Drawings
A. V. Goethem, Kevin Verbeek
We describe an algorithm that morphs between two planar orthogonal drawings $\Gamma_I$ and $\Gamma_O$ of a connected graph $G$, while preserving planarity and orthogonality. Necessarily, $\Gamma_I$ and $\Gamma_O$ share the same combinatorial embedding. Our morph uses a linear number of linear morphs (linear interpolations between two drawings) and preserves linear complexity throughout the process, thereby answering an open question from Biedl et al. Our algorithm first unifies the two drawings to ensure an equal number of (virtual) bends on each edge. We then interpret bends as vertices which form obstacles for so-called wires: horizontal and vertical lines separating the vertices of $\Gamma_O$. These wires define homotopy classes with respect to the vertices of $G$ (for the combinatorial embedding of $G$ shared by $\Gamma_I$ and $\Gamma_O$). These homotopy classes can be represented by orthogonal polylines in $\Gamma_I$. We argue that the structural difference between the two drawings can be captured by the spirality of the wires in $\Gamma_I$, which guides our morph from $\Gamma_I$ to $\Gamma_O$.
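A single linear morph, as defined above, is just a vertex-wise linear interpolation between two drawings with matching vertex (and virtual-bend) sets. The sketch below shows only that step; choosing the sequence of intermediate drawings is the paper's contribution, and the coordinates are invented for illustration.

```python
def linear_morph(drawing_a, drawing_b, t):
    """Linear morph: interpolate each vertex position independently.

    Both drawings map the same vertex set (here, after the unification step
    that equalises the number of bends) to 2D coordinates; t = 0 gives the
    first drawing and t = 1 the second.
    """
    assert drawing_a.keys() == drawing_b.keys()
    return {v: ((1 - t) * drawing_a[v][0] + t * drawing_b[v][0],
                (1 - t) * drawing_a[v][1] + t * drawing_b[v][1])
            for v in drawing_a}

gamma_i = {"u": (0.0, 0.0), "v": (2.0, 0.0), "bend": (2.0, 1.0)}
gamma_o = {"u": (0.0, 1.0), "v": (3.0, 1.0), "bend": (3.0, 2.0)}
print(linear_morph(gamma_i, gamma_o, 0.5))   # halfway drawing
```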
{"title":"Optimal Morphs of Planar Orthogonal Drawings","authors":"A. V. Goethem, Kevin Verbeek","doi":"10.4230/LIPIcs.SoCG.2018.42","DOIUrl":"https://doi.org/10.4230/LIPIcs.SoCG.2018.42","url":null,"abstract":"We describe an algorithm that morphs between two planar orthogonal drawings $Gamma_I$ and $Gamma_O$ of a connected graph $G$, while preserving planarity and orthogonality. Necessarily $Gamma_I$ and $Gamma_O$ share the same combinatorial embedding. Our morph uses a linear number of linear morphs (linear interpolations between two drawings) and preserves linear complexity throughout the process, thereby answering an open question from Biedl et al. \u0000Our algorithm first unifies the two drawings to ensure an equal number of (virtual) bends on each edge. We then interpret bends as vertices which form obstacles for so-called wires: horizontal and vertical lines separating the vertices of $Gamma_O$. These wires define homotopy classes with respect to the vertices of $G$ (for the combinatorial embedding of $G$ shared by $Gamma_I$ and $Gamma_O$). These homotopy classes can be represented by orthogonal polylines in $Gamma_I$. We argue that the structural difference between the two drawings can be captured by the spirality of the wires in $Gamma_I$, which guides our morph from $Gamma_I$ to $Gamma_O$.","PeriodicalId":54969,"journal":{"name":"International Journal of Computational Geometry & Applications","volume":"181 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80257526","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2017-10-31. DOI: 10.5121/IJCGA.2017.7401
ICVG: Practical Constructive Volume Geometry for Indirect Visualization
Mark Laprairie, Howard J. Hamilton, A. Geiger
The task of creating detailed three-dimensional virtual worlds for interactive entertainment software can be simplified by using Constructive Solid Geometry (CSG) techniques. CSG allows artists to combine primitive shapes, visualized through polygons, into complex and believable scenery. Constructive Volume Geometry (CVG) is a superset of CSG that operates on volumetric data, which consists of values recorded at constant intervals in three dimensions of space. To allow volumetric data to be integrated into existing frameworks, indirect visualization is performed by constructing and visualizing polygon meshes corresponding to the implicit surfaces in the volumetric data. We introduce the Indirect CVG (ICVG) algebra, which provides constructive volume geometry operators appropriate to volumetric data that will be indirectly visualized. ICVG includes operations analogous to the union, difference, and intersection operators in the standard CVG algebra, as well as new operations. Additionally, a series of volumetric primitives well suited to indirect visualization is defined.
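As a rough illustration of CSG-style operators on volumetric data (not the ICVG algebra itself), the sketch below combines two sampled signed-distance volumes with min/max; the zero level set of each result is the implicit surface that an isosurface mesher such as marching cubes would polygonize for indirect visualization. The grid resolution and primitive shapes are arbitrary choices.

```python
import numpy as np

# Sample two (approximate) signed-distance volumes on a common grid
# (negative = inside the solid).
n = 64
ax = np.linspace(-1.5, 1.5, n)
x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")

sphere = np.sqrt(x**2 + y**2 + z**2) - 1.0                        # unit sphere
box = np.maximum.reduce([np.abs(x), np.abs(y), np.abs(z)]) - 0.8  # axis-aligned cube

# CSG-style operators realised with pointwise min/max on the volumes.
union = np.minimum(sphere, box)
intersection = np.maximum(sphere, box)
difference = np.maximum(sphere, -box)     # sphere minus box

print(union.min() < 0 < union.max())      # the combined solid has an interior and exterior
```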
{"title":"ICVG : Practical Constructive Volume Geometry for Indirect Visualization","authors":"Mark Laprairie, Howard J. Hamilton, A. Geiger","doi":"10.5121/IJCGA.2017.7401","DOIUrl":"https://doi.org/10.5121/IJCGA.2017.7401","url":null,"abstract":"The task of creating detailed three dimensional virtual worlds for interactive entertainment software can be simplified by using Constructive Solid Geometry (CSG) techniques. CSG allows artists to combine primitive shapes, visualized through polygons, into complex and believable scenery. Constructive Volume Geometry (CVG) is a super-set of CSG that operates on volumetric data, which consists of values recorded at constant intervals in three dimensions of space. To allow volumetric data to be integrated into existing frameworks, indirect visualization is performed by constructing and visualizing polygon meshes corresponding to the implicit surfaces in the volumetric data. The Indirect CVG (ICVG) algebra, which provides constructive volume geometry operators appropriate to volumetric data that will be indirectly visualized is introduced. ICVG includes operations analogous to the union, difference, and intersection operators in the standard CVG algebra, as well as new operations. Additionally, a series of volumetric primitives well suited to indirect visualization is defined.","PeriodicalId":54969,"journal":{"name":"International Journal of Computational Geometry & Applications","volume":"7 1","pages":"1-19"},"PeriodicalIF":0.0,"publicationDate":"2017-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47239133","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2017-10-01. DOI: 10.4230/LIPIcs.SoCG.2018.11
Orthogonal Terrain Guarding is NP-complete
Édouard Bonnet, P. Giannopoulos
A terrain is an x-monotone polygonal curve, i.e., successive vertices have increasing x-coordinates. Terrain Guarding can be seen as a special case of the famous art gallery problem where one has to place at most $k$ guards on a terrain made of $n$ vertices in order to fully see it. In 2010, King and Krohn showed that Terrain Guarding is NP-complete [SODA '10, SIAM J. Comput. '11], thereby solving a long-standing open question. They observe that their proof does not settle the complexity of Orthogonal Terrain Guarding, where the terrain consists only of horizontal and vertical segments; such terrains are called rectilinear or orthogonal. Recently, Ashok et al. [SoCG '17] presented an FPT algorithm running in time $k^{O(k)}n^{O(1)}$ for Dominating Set in the visibility graphs of rectilinear terrains without 180-degree vertices. They ask whether Orthogonal Terrain Guarding is in P or NP-hard. In the same paper, they give a subexponential-time algorithm running in $n^{O(\sqrt{n})}$ (actually even $n^{O(\sqrt{k})}$) for general Terrain Guarding and note that the hardness proof of King and Krohn only rules out a running time of $2^{o(n^{1/4})}$ under the ETH. Hence, there is a significant gap between their $2^{O(n^{1/2}\log n)}$-time algorithm and the $2^{o(n^{1/4})}$ ETH lower bound implied by King and Krohn's result. In this paper, we answer those two remaining questions. We adapt the gadgets of King and Krohn to rectilinear terrains in order to prove that even Orthogonal Terrain Guarding is NP-complete. Then, we show how their reduction from Planar 3-SAT (as well as our adaptation for rectilinear terrains) can be made linear (instead of quadratic).
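The visibility predicate underlying Terrain Guarding is easy to state: two terrain vertices see each other when the segment between them stays on or above the terrain. The brute-force checker below is a toy sketch (not a guarding algorithm) that verifies a candidate guard set on a small made-up terrain.

```python
def sees(terrain, i, j):
    """Do terrain vertices i and j see each other?

    `terrain` is an x-monotone list of (x, y) vertices.  Since both the
    terrain and the connecting segment are piecewise linear in x, it
    suffices to check the terrain vertices strictly between i and j.
    """
    lo, hi = min(i, j), max(i, j)
    (x1, y1), (x2, y2) = terrain[lo], terrain[hi]
    for k in range(lo + 1, hi):
        xk, yk = terrain[k]
        y_seg = y1 + (y2 - y1) * (xk - x1) / (x2 - x1)
        if yk > y_seg + 1e-12:        # terrain pokes above the sight line
            return False
    return True

def guards_cover(terrain, guards):
    """Brute-force check that every vertex is seen by some guard vertex."""
    return all(any(sees(terrain, g, v) for g in guards) for v in range(len(terrain)))

T = [(0, 0), (1, 2), (2, 0), (3, 2), (4, 0)]   # a small "W"-shaped terrain
print(guards_cover(T, guards=[1, 3]))          # True: the two peaks see everything
```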
{"title":"Orthogonal Terrain Guarding is NP-complete","authors":"Édouard Bonnet, P. Giannopoulos","doi":"10.4230/LIPIcs.SoCG.2018.11","DOIUrl":"https://doi.org/10.4230/LIPIcs.SoCG.2018.11","url":null,"abstract":"A terrain is an x-monotone polygonal curve, i.e., successive vertices have increasing x-coordinates. Terrain Guarding can be seen as a special case of the famous art gallery problem where one has to place at most $k$ guards on a terrain made of $n$ vertices in order to fully see it. In 2010, King and Krohn showed that Terrain Guarding is NP-complete [SODA '10, SIAM J. Comput. '11] thereby solving a long-standing open question. They observe that their proof does not settle the complexity of Orthogonal Terrain Guarding where the terrain only consists of horizontal or vertical segments; those terrains are called rectilinear or orthogonal. Recently, Ashok et al. [SoCG'17] presented an FPT algorithm running in time $k^{O(k)}n^{O(1)}$ for Dominating Set in the visibility graphs of rectilinear terrains without 180-degree vertices. They ask if Orthogonal Terrain Guarding is in P or NP-hard. In the same paper, they give a subexponential-time algorithm running in $n^{O(sqrt n)}$ (actually even $n^{O(sqrt k)}$) for the general Terrain Guarding and notice that the hardness proof of King and Krohn only disproves a running time $2^{o(n^{1/4})}$ under the ETH. Hence, there is a significant gap between their $2^{O(n^{1/2} log n)}$-algorithm and the no $2^{o(n^{1/4})}$ ETH-hardness implied by King and Krohn's result. In this paper, we answer those two remaining questions. We adapt the gadgets of King and Krohn to rectilinear terrains in order to prove that even Orthogonal Terrain Guarding is NP-complete. Then, we show how their reduction from Planar 3-SAT (as well as our adaptation for rectilinear terrains) can actually be made linear (instead of quadratic).","PeriodicalId":54969,"journal":{"name":"International Journal of Computational Geometry & Applications","volume":"127 1","pages":"21-44"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73397856","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2017-09-04. DOI: 10.4230/LIPIcs.IPEC.2017.8
On the Parameterized Complexity of Red-Blue Points Separation
Édouard Bonnet, P. Giannopoulos, M. Lampis
We study the following geometric separation problem: given a set $R$ of red points and a set $B$ of blue points in the plane, find a minimum-size set of lines that separate $R$ from $B$. We show that, in its full generality, parameterized by the number of lines $k$ in the solution, the problem is unlikely to be solvable significantly faster than the brute-force $n^{O(k)}$-time algorithm, where $n$ is the total number of points. Indeed, we show that an algorithm running in time $f(k)\,n^{o(k/\log k)}$, for any computable function $f$, would disprove the ETH. Our reduction crucially relies on selecting lines from a set with a large number of different slopes (i.e., this number is not a function of $k$). Conjecturing that the problem variant where the lines are required to be axis-parallel is FPT in the number of lines, we show the following preliminary result: separating $R$ from $B$ with a minimum-size set of axis-parallel lines is FPT in the size of either set, and can be solved in time $O^*(9^{|B|})$ (assuming that $B$ is the smaller set).
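Verifying a candidate set of axis-parallel lines is straightforward: the lines separate $R$ from $B$ exactly when no cell of the grid they induce contains points of both colours. The sketch below implements only that check (the FPT algorithm is the paper's contribution); the point sets and line coordinates are invented for illustration.

```python
from bisect import bisect

def separates(red, blue, vertical_xs, horizontal_ys):
    """Check that axis-parallel lines separate red from blue points.

    Each point is mapped to the grid cell induced by the lines (points are
    assumed not to lie on any line); separation holds exactly when no cell
    contains points of both colours.
    """
    vx, hy = sorted(vertical_xs), sorted(horizontal_ys)

    def cell(p):
        return (bisect(vx, p[0]), bisect(hy, p[1]))

    red_cells = {cell(p) for p in red}
    return all(cell(q) not in red_cells for q in blue)

R = [(1, 1), (1, 4)]
B = [(4, 1), (4, 4)]
print(separates(R, B, vertical_xs=[2.5], horizontal_ys=[]))    # True: one vertical line suffices
print(separates(R, B, vertical_xs=[], horizontal_ys=[2.5]))    # False: a horizontal line does not
```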
{"title":"On the Parameterized Complexity of Red-Blue Points Separation","authors":"Édouard Bonnet, P. Giannopoulos, M. Lampis","doi":"10.4230/LIPIcs.IPEC.2017.8","DOIUrl":"https://doi.org/10.4230/LIPIcs.IPEC.2017.8","url":null,"abstract":"We study the following geometric separation problem: \u0000Given a set $R$ of red points and a set $B$ of blue points in the plane, find a minimum-size set of lines that separate $R$ from $B$. We show that, in its full generality, parameterized by the number of lines $k$ in the solution, the problem is unlikely to be solvable significantly faster than the brute-force $n^{O(k)}$-time algorithm, where $n$ is the total number of points. Indeed, we show that an algorithm running in time $f(k)n^{o(k/ log k)}$, for any computable function $f$, would disprove ETH. Our reduction crucially relies on selecting lines from a set with a large number of different slopes (i.e., this number is not a function of $k$). Conjecturing that the problem variant where the lines are required to be axis-parallel is FPT in the number of lines, we show the following preliminary result. Separating $R$ from $B$ with a minimum-size set of axis-parallel lines is FPT in the size of either set, and can be solved in time $O^*(9^{|B|})$ (assuming that $B$ is the smallest set).","PeriodicalId":54969,"journal":{"name":"International Journal of Computational Geometry & Applications","volume":"78 1","pages":"181-206"},"PeriodicalIF":0.0,"publicationDate":"2017-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84062828","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}