
Latest publications in Bulletin of the Society of Sea Water Science, Japan

Document Retrieval Hacks
Pub Date : 2021-01-01 DOI: 10.4230/LIPIcs.SEA.2021.12
S. Puglisi, Bella Zhukova
Given a collection of strings, document listing refers to the problem of finding all the strings (or documents) where a given query string (or pattern) appears. Index data structures that support efficient document listing for string collections have been the focus of intense research in the last decade, with dozens of papers published describing exotic and elegant compressed data structures. The problem is now quite well understood in theory and many of the solutions have been implemented and evaluated experimentally. A particular recent focus has been on highly repetitive document collections, which have become prevalent in many areas (such as version control systems and genomics – to name just two very different sources). The aim of this paper is to describe simple and efficient document listing algorithms that can be used in combination with more sophisticated techniques, or as baselines against which the performance of new document listing indexes can be measured. Our approaches are based on simple combinations of scanning and hashing, which we show to combine very well with dictionary compression to achieve small space usage. Our experiments show these methods to be often much faster and less space consuming than the best specialized indexes for the problem.
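To make the scan-and-hash baseline concrete, here is a minimal Python sketch (an illustration written for this summary, not the authors' code, and without the dictionary-compression component): the documents are concatenated, every occurrence of the pattern is found by scanning, each occurrence is mapped back to its document by binary search over the start offsets, and the resulting document ids are hashed into a set to remove duplicates.

```python
import bisect

def list_documents(docs, pattern):
    """Scan-and-hash document listing baseline.

    Assumes `pattern` contains no NUL byte, so matches cannot cross the
    separator inserted between documents.
    """
    text = "\x00".join(docs)          # concatenated collection with separators
    starts, pos = [], 0
    for d in docs:                     # start offset of each document in `text`
        starts.append(pos)
        pos += len(d) + 1
    hits = set()                       # hashing deduplicates document ids
    i = text.find(pattern)
    while i != -1:                     # scan all occurrences left to right
        hits.add(bisect.bisect_right(starts, i) - 1)
        i = text.find(pattern, i + 1)
    return sorted(hits)

# Example: list_documents(["banana", "bandana", "cherry"], "ana") -> [0, 1]
```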
Citations: 5
How to Find the Exit from a 3-Dimensional Maze
Pub Date : 2021-01-01 DOI: 10.4230/LIPIcs.SEA.2021.21
M. Hermann
We present several experimental algorithms for fast computation of variadic polynomials over non-negative integers.
2012 ACM Subject Classification: Theory of computation → Theory and algorithms for application domains
Citations: 0
An Experimental Study of External Memory Algorithms for Connected Components
Pub Date : 2021-01-01 DOI: 10.4230/LIPIcs.SEA.2021.23
G. Brodal, Rolf Fagerberg, David Hammer, U. Meyer, M. Penschuck, Hung Tran
We empirically investigate algorithms for solving Connected Components in the external memory model. In particular, we study whether the randomized O(Sort(E)) algorithm by Karger, Klein, and Tarjan can be implemented to compete with practically promising and simpler algorithms having only slightly worse theoretical cost, namely Borůvka’s algorithm and the algorithm by Sibeyn and collaborators. For all algorithms, we develop and test a number of tuning options. Our experiments are executed on a large set of different graph classes including random graphs, grids, geometric graphs, and hyperbolic graphs. Among our findings are: The Sibeyn algorithm is a very strong contender due to its simplicity and due to an added degree of freedom in its internal workings when used in the Connected Components setting. With the right tunings, the Karger-Klein-Tarjan algorithm can be implemented to be competitive in many cases. Higher graph density seems to benefit Karger-Klein-Tarjan relative to Sibeyn. Borůvka’s algorithm is not competitive with the two others.
2012 ACM Subject Classification: Mathematics of computing → Paths and connectivity problems; Theory of computation → Graph algorithms analysis
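As a point of reference for the Borůvka-style hooking mentioned above, here is a plain in-memory sketch written for this summary (the paper evaluates tuned external-memory implementations, which this is not): in every round each component picks one incident inter-component edge, the picked endpoints are merged with a small union-find, and the process repeats until no inter-component edge remains.

```python
def boruvka_components(n, edges):
    """Connected-component labels via Borůvka-style hooking (in-memory toy)."""
    comp = list(range(n))                       # current component label per vertex
    while True:
        # Each component selects one incident edge that leaves it (if any).
        out_edge = {}
        for u, v in edges:
            cu, cv = comp[u], comp[v]
            if cu != cv:
                out_edge.setdefault(cu, (u, v))
                out_edge.setdefault(cv, (u, v))
        if not out_edge:                        # no inter-component edge left: done
            break
        # Contract the selected edges with a tiny union-find over the labels.
        parent = {c: c for c in set(comp)}
        def find(c):
            while parent[c] != c:
                parent[c] = parent[parent[c]]   # path halving
                c = parent[c]
            return c
        for u, v in out_edge.values():
            ru, rv = find(comp[u]), find(comp[v])
            if ru != rv:
                parent[max(ru, rv)] = min(ru, rv)
        comp = [find(c) for c in comp]          # relabel all vertices
    return comp

# Example: boruvka_components(5, [(0, 1), (1, 2), (3, 4)]) -> [0, 0, 0, 3, 3]
```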
Citations: 2
Space and Time Trade-Off for the k Shortest Simple Paths Problem
Pub Date : 2020-06-16 DOI: 10.4230/LIPICS.SEA.2020.18
Ali Al Zoobi, D. Coudert, N. Nisse
The k shortest simple path problem (kSSP) asks to compute a set of top-k shortest simple paths from a vertex s to a vertex t in a digraph. Yen (1971) proposed the first algorithm with the best known theoretical complexity of O(kn(m + n log n)) for a digraph with n vertices and m arcs. Since then, the problem has been widely studied from an algorithm engineering perspective, and impressive improvements have been achieved. In particular, Kurz and Mutzel (2016) proposed a sidetracks-based (SB) algorithm which is currently the fastest solution. In this work, we propose two improvements of this algorithm. We first show how to speed up the SB algorithm using dynamic updates of shortest path trees. We ran experiments on road networks of the 9th DIMACS Challenge with up to about half a million nodes and one million arcs. Our computational results show an average speed-up by a factor of 1.5 to 2 with a similar working memory consumption as SB. We then propose a second algorithm enabling us to significantly reduce the working memory at the cost of an increase in running time (up to two times slower). Our experiments on the same data set show, on average, a reduction by a factor of 1.5 to 2 of the working memory.
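For context, the classical Yen algorithm that these engineered solutions improve upon fits in a short sketch. The code below is an illustrative baseline written for this summary, using a plain Dijkstra subroutine; it does not implement the paper's dynamic shortest-path-tree updates or the SB algorithm.

```python
import heapq

def dijkstra(graph, s, t, banned_nodes=frozenset(), banned_edges=frozenset()):
    """Shortest s-t path in `graph` (dict: u -> {v: weight}), ignoring banned items."""
    dist, prev, seen = {s: 0}, {}, set()
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == t:                                   # reconstruct the path
            path = [t]
            while path[-1] != s:
                path.append(prev[path[-1]])
            return d, path[::-1]
        for v, w in graph.get(u, {}).items():
            if v in banned_nodes or (u, v) in banned_edges:
                continue
            if v not in dist or d + w < dist[v]:
                dist[v], prev[v] = d + w, u
                heapq.heappush(pq, (d + w, v))
    return None

def yen_k_shortest(graph, s, t, k):
    """Top-k shortest simple s-t paths as (cost, path) pairs (Yen, 1971)."""
    first = dijkstra(graph, s, t)
    if first is None:
        return []
    A, B = [first], []                               # accepted paths, candidate heap
    while len(A) < k:
        _, path_prev = A[-1]
        for i in range(len(path_prev) - 1):
            spur, root = path_prev[i], path_prev[:i + 1]
            # Ban deviations already used by accepted paths sharing this root.
            banned_edges = {(p[i], p[i + 1]) for _, p in A
                            if len(p) > i + 1 and p[:i + 1] == root}
            res = dijkstra(graph, spur, t,
                           frozenset(root[:-1]), frozenset(banned_edges))
            if res is None:
                continue
            spur_cost, spur_path = res
            root_cost = sum(graph[root[j]][root[j + 1]] for j in range(i))
            cand = (root_cost + spur_cost, root[:-1] + spur_path)
            if cand not in A and cand not in B:
                heapq.heappush(B, cand)
        if not B:
            break
        A.append(heapq.heappop(B))
    return A

# Example: g = {"s": {"a": 1, "b": 2}, "a": {"t": 1}, "b": {"t": 1}, "t": {}}
# yen_k_shortest(g, "s", "t", 2) -> [(2, ['s', 'a', 't']), (3, ['s', 'b', 't'])]
```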
Citations: 5
Probing a Set of Trajectories to Maximize Captured Information
Pub Date : 2020-04-07 DOI: 10.4230/LIPIcs.SEA.2020.5
S. Fekete, Alexander Hill, Dominik Krupke, Tyler Mayer, Joseph S. B. Mitchell, Ojas D. Parekh, C. Phillips
We study a trajectory analysis problem we call the Trajectory Capture Problem (TCP), in which, for a given input set $\mathcal{T}$ of trajectories in the plane, and an integer $k \geq 2$, we seek to compute a set of $k$ points ("portals") to maximize the total weight of all subtrajectories of $\mathcal{T}$ between pairs of portals. This problem naturally arises in trajectory analysis and summarization. We show that the TCP is NP-hard (even in very special cases) and give some first approximation results. Our main focus is on attacking the TCP with practical algorithm-engineering approaches, including integer linear programming (to solve instances to provable optimality) and local search methods. We study the integrality gap arising from such approaches. We analyze our methods on different classes of data, including benchmark instances that we generate. Our goal is to understand the best performing heuristics, based on both solution time and solution quality. We demonstrate that we are able to compute provably optimal solutions for real-world instances.
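To illustrate only the local-search side, here is a toy swap-based search under a heavily simplified objective of my own devising (portals are restricted to trajectory points, and a trajectory's captured weight is taken as the number of edges between its first and last visit to a portal). It is not the paper's ILP formulation, its exact objective, or its benchmark setup.

```python
import random

def captured_weight(trajectories, portals):
    """Toy objective: edges lying between the first and last portal visit of each trajectory."""
    portals = set(portals)
    total = 0
    for traj in trajectories:                       # traj: list of hashable points
        visits = [i for i, p in enumerate(traj) if p in portals]
        if len(visits) >= 2:
            total += visits[-1] - visits[0]
    return total

def local_search_portals(trajectories, k, iters=2000, seed=0):
    """Swap one portal at a time for a random candidate; keep improving moves."""
    rng = random.Random(seed)
    candidates = sorted({p for traj in trajectories for p in traj})
    portals = rng.sample(candidates, k)
    best = captured_weight(trajectories, portals)
    for _ in range(iters):
        i, new = rng.randrange(k), rng.choice(candidates)
        if new in portals:
            continue
        trial = portals[:i] + [new] + portals[i + 1:]
        w = captured_weight(trajectories, trial)
        if w > best:
            portals, best = trial, w
    return portals, best
```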
Citations: 1
Variable Shift SDD: A More Succinct Sentential Decision Diagram
Pub Date : 2020-04-01 DOI: 10.4230/LIPIcs.SEA.2020.22
Kengo Nakamura, Shuhei Denzumi, Masaaki Nishino
The Sentential Decision Diagram (SDD) is a tractable representation of Boolean functions that subsumes the famous Ordered Binary Decision Diagram (OBDD) as a strict subset. SDDs are attracting much attention because they are more succinct than OBDDs, as well as having canonical forms and supporting many useful queries and transformations such as model counting and Apply operation. In this paper, we propose a more succinct variant of SDD named Variable Shift SDD (VS-SDD). The key idea is to create a unique representation for Boolean functions that are equivalent under a specific variable substitution. We show that VS-SDDs are never larger than SDDs and there are cases in which the size of a VS-SDD is exponentially smaller than that of an SDD. Moreover, despite such succinctness, we show that numerous basic operations that are supported in polytime with SDD are also supported in polytime with VS-SDD. Experiments confirm that VS-SDDs are significantly more succinct than SDDs when applied to classical planning instances, where inherent symmetry exists.
Citations: 6
Efficient Route Planning with Temporary Driving Bans, Road Closures, and Rated Parking Areas
Pub Date : 2020-04-01 DOI: 10.4230/LIPIcs.SEA.2020.17
A. Kleff, F. Schulz, Jakob Wagenblatt, Tim Zeitz
We study the problem of planning routes in road networks when certain streets or areas are closed at certain times. For heavy vehicles, such areas may be very large since many European countries impose temporary driving bans during the night or on weekends. In this setting, feasible routes may require waiting at parking areas, and several feasible routes with different trade-offs between waiting and driving detours around closed areas may exist. We propose a novel model in which driving and waiting are assigned abstract costs, and waiting costs are location-dependent to reflect the different quality of the parking areas. Our goal is to find Pareto-optimal routes with regards to arrival time at the destination and total cost. We investigate the complexity of the model and determine a necessary constraint on the cost parameters such that the problem is solvable in polynomial time. We present a thoroughly engineered implementation and perform experiments on a production-grade real world data set. The experiments show that our implementation can answer realistic queries in around a second or less which makes it feasible for practical application.
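The Pareto-optimal search over arrival time and abstract cost can be pictured with a plain bicriteria label-correcting algorithm. The sketch below (written for this summary) keeps only nondominated (arrival time, cost) labels per node; it omits the paper's driving bans, waiting at rated parking areas, and all engineering.

```python
import heapq

def pareto_labels(graph, source, target, start_time=0):
    """graph: u -> list of (v, travel_time, cost) arcs.
    Returns the Pareto-optimal (arrival_time, total_cost) pairs at `target`."""
    labels = {source: [(start_time, 0)]}
    pq = [(start_time, 0, source)]
    while pq:
        t, c, u = heapq.heappop(pq)
        if (t, c) not in labels.get(u, []):
            continue                                   # label was dominated after queueing
        for v, travel, arc_cost in graph.get(u, []):
            nt, nc = t + travel, c + arc_cost
            bucket = labels.setdefault(v, [])
            if any(ot <= nt and oc <= nc for ot, oc in bucket):
                continue                               # dominated by an existing label at v
            bucket[:] = [(ot, oc) for ot, oc in bucket
                         if not (nt <= ot and nc <= oc)]  # drop labels the new one dominates
            bucket.append((nt, nc))
            heapq.heappush(pq, (nt, nc, v))
    return sorted(labels.get(target, []))

# Example: g = {"s": [("a", 2, 1), ("t", 10, 1)], "a": [("t", 3, 5)], "t": []}
# pareto_labels(g, "s", "t") -> [(5, 6), (10, 1)]
```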
Citations: 4
An Algorithm for the Exact Treedepth Problem
Pub Date : 2020-04-01 DOI: 10.4230/LIPICS.SEA.2020.19
James Trimble
We present a novel algorithm for the minimum-depth elimination tree problem, which is equivalent to the optimal treedepth decomposition problem. Our algorithm makes use of two cheaply-computed lower bound functions to prune the search tree, along with symmetry-breaking and domination rules. We present an empirical study showing that the algorithm outperforms the current state-of-the-art solver (which is based on a SAT encoding) by orders of magnitude on a range of graph classes.
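The recursion behind exact treedepth search is compact: a graph's treedepth is the maximum over its connected components, and for a connected graph on more than one vertex it is 1 plus the best choice of a root vertex to delete. The memoized brute force below states it directly; it is exponential and only a correctness reference written for this summary, not the paper's branch-and-bound solver with lower bounds and symmetry breaking.

```python
from functools import lru_cache

def treedepth(n, edges):
    """Exact treedepth of the graph on vertices 0..n-1 (exponential-time reference)."""
    if n == 0:
        return 0
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def components(vertices):
        """Yield the connected components of the induced subgraph as frozensets."""
        vertices = set(vertices)
        while vertices:
            comp, stack = set(), [next(iter(vertices))]
            while stack:
                u = stack.pop()
                if u not in comp:
                    comp.add(u)
                    stack.extend((adj[u] & vertices) - comp)
            vertices -= comp
            yield frozenset(comp)

    @lru_cache(maxsize=None)
    def td(vertices):
        """Treedepth of a connected induced subgraph given as a frozenset."""
        if len(vertices) == 1:
            return 1
        # td(G) = 1 + min over root v of the max treedepth of the components of G - v
        return 1 + min(max(td(c) for c in components(vertices - {v}))
                       for v in vertices)

    return max(td(c) for c in components(range(n)))

# Example: treedepth(4, [(0, 1), (1, 2), (2, 3)]) -> 3   (path on four vertices)
```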
Citations: 5
Crystal Structure Prediction via Oblivious Local Search
Pub Date : 2020-03-20 DOI: 10.4230/LIPIcs.SEA.2020.21
Dmytro Antypov, Argyrios Deligkas, V. Gusev, M. Rosseinsky, P. Spirakis, Michail Theofilatos
We study Crystal Structure Prediction, one of the major problems in computational chemistry. This is essentially a continuous optimization problem, where many different, simple and sophisticated, methods have been proposed and applied. The simple searching techniques are easy to understand, usually easy to implement, but they can be slow in practice. On the other hand, the more sophisticated approaches perform well in general, however almost all of them have a large number of parameters that require fine tuning and, in the majority of the cases, chemical expertise is needed in order to properly set them up. In addition, due to the chemical expertise involved in the parameter-tuning, these approaches can be biased towards previously-known crystal structures. Our contribution is twofold. Firstly, we formalize the Crystal Structure Prediction problem, alongside several other intermediate problems, from a theoretical computer science perspective. Secondly, we propose an oblivious algorithm for Crystal Structure Prediction that is based on local search. Oblivious means that our algorithm requires minimal knowledge about the composition we are trying to compute a crystal structure for. In addition, our algorithm can be used as an intermediate step by any method. Our experiments show that our algorithms outperform the standard basin hopping, a well studied algorithm for the problem.
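To give the flavour of an oblivious local search on a continuous energy landscape, here is a toy sketch that perturbs one atom of a small Lennard-Jones cluster at a time and keeps only energy-decreasing moves. The potential, cluster size, and move rule are assumptions made for illustration; this is not the paper's algorithm or its comparison against basin hopping.

```python
import random

def lj_energy(coords):
    """Total Lennard-Jones pair energy of a cluster of 3D points (sigma = epsilon = 1)."""
    e = 0.0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            r2 = sum((a - b) ** 2 for a, b in zip(coords[i], coords[j]))
            inv6 = 1.0 / r2 ** 3                    # (1/r)^6
            e += 4.0 * (inv6 * inv6 - inv6)         # 4[(1/r)^12 - (1/r)^6]
    return e

def oblivious_local_search(n_atoms=8, steps=20000, step_size=0.2, seed=0):
    """Random single-atom moves, accepted only when the total energy decreases."""
    rng = random.Random(seed)
    coords = [[rng.uniform(0.0, 2.0) for _ in range(3)] for _ in range(n_atoms)]
    best = lj_energy(coords)
    for _ in range(steps):
        i = rng.randrange(n_atoms)
        old = coords[i][:]
        coords[i] = [x + rng.gauss(0.0, step_size) for x in old]
        e = lj_energy(coords)
        if e < best:
            best = e                                # keep the improving move
        else:
            coords[i] = old                         # revert a non-improving move
    return best, coords
```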
Citations: 4
Faster Fully Dynamic Transitive Closure in Practice
Pub Date : 2020-02-03 DOI: 10.4230/LIPIcs.SEA.2020.14
Kathrin Hanauer, M. Henzinger, Christian Schulz
The fully dynamic transitive closure problem asks to maintain reachability information in a directed graph between arbitrary pairs of vertices, while the graph undergoes a sequence of edge insertions and deletions. The problem has been thoroughly investigated in theory and many specialized algorithms for solving it have been proposed in the last decades. In two large studies [Frigioni et al., 2001; Krommidas and Zaroliagis, 2008], a number of these algorithms have been evaluated experimentally against simple static algorithms for graph traversal, showing the competitiveness and even superiority of the simple algorithms in practice, except for very dense random graphs or very high ratios of queries. A major drawback of those studies is that only small and mostly randomly generated graphs are considered. In this paper, we engineer new algorithms to maintain all-pairs reachability information which are simple and space-efficient. Moreover, we perform an extensive experimental evaluation on both generated and real-world instances that are several orders of magnitude larger than those in the previous studies. Our results indicate that our new algorithms outperform all state-of-the-art algorithms on all types of input considerably in practice.
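The "simple static algorithm" baseline such studies compare against is easy to state: keep the adjacency lists current under insertions and deletions and answer every reachability query with a fresh BFS. The sketch below (written for this summary; it is not the authors' engineered data structures) shows the interface these experiments exercise.

```python
from collections import defaultdict, deque

class DynamicReachability:
    """Fully dynamic reachability via per-query BFS (a simple baseline, no preprocessing)."""

    def __init__(self):
        self.out = defaultdict(set)          # adjacency: u -> set of successors

    def insert_edge(self, u, v):
        self.out[u].add(v)

    def delete_edge(self, u, v):
        self.out[u].discard(v)

    def reaches(self, s, t):
        """Is there a directed path from s to t in the current graph?"""
        if s == t:
            return True
        seen, queue = {s}, deque([s])
        while queue:
            u = queue.popleft()
            for v in self.out[u]:
                if v == t:
                    return True
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        return False

# Example:
# g = DynamicReachability(); g.insert_edge(1, 2); g.insert_edge(2, 3)
# g.reaches(1, 3) -> True; g.delete_edge(2, 3); g.reaches(1, 3) -> False
```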
Citations: 11