{"title":"Liquid Amortization: Proving Amortized Complexity with LiquidHaskell (Functional Pearl)","authors":"Jan van Brügge","doi":"arxiv-2407.13671","DOIUrl":null,"url":null,"abstract":"Formal reasoning about the time complexity of algorithms and data structures\nis usually done in interactive theorem provers like Isabelle/HOL. This includes\nreasoning about amortized time complexity which looks at the worst case\nperformance over a series of operations. However, most programs are not written\nwithin a theorem prover and thus use the data structures of the production\nlanguage. To verify the correctness it is necessary to translate the data\nstructures from the production language into the language of the prover. Such a\ntranslation step could introduce errors, for example due to a mismatch in\nfeatures between the two languages. We show how to prove amortized complexity\nof data structures directly in Haskell using LiquidHaskell. Besides skipping\nthe translation step, our approach can also provide a didactic advantage.\nLearners do not have to learn an additional language for proofs and can focus\non the new concepts only. For this paper, we do not assume prior knowledge of\namortized complexity as we explain the concepts and apply them in our first\ncase study, a simple stack with multipop. Moving to more complicated (and\nuseful) data structures, we show that the same technique works for binomial\nheaps which can be used to implement a priority queue. We also prove amortized\ncomplexity bounds for Claessen's version of the finger tree, a sequence-like\ndata structure with constant-time cons/uncons on either end. Finally we discuss\nthe current limitations of LiquidHaskell that made certain versions of the data\nstructures not feasible.","PeriodicalId":501024,"journal":{"name":"arXiv - CS - Computational Complexity","volume":"95 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Computational Complexity","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2407.13671","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Formal reasoning about the time complexity of algorithms and data structures
is usually done in interactive theorem provers like Isabelle/HOL. This includes
reasoning about amortized time complexity, which looks at the worst-case
performance over a sequence of operations. However, most programs are not written
within a theorem prover and thus use the data structures of the production
language. To verify correctness, the data structures must be translated from
the production language into the language of the prover. Such a
translation step could introduce errors, for example due to a mismatch in
features between the two languages. We show how to prove amortized complexity
of data structures directly in Haskell using LiquidHaskell. Besides skipping
the translation step, our approach can also provide a didactic advantage.
Learners do not have to learn an additional language for proofs and can focus
solely on the new concepts. For this paper, we do not assume prior knowledge of
amortized complexity as we explain the concepts and apply them in our first
case study, a simple stack with multipop (a cost-accounting sketch of this
example follows the abstract). Moving to more complicated (and useful) data
structures, we show that the same technique works for binomial heaps, which can
be used to implement a priority queue. We also prove amortized
complexity bounds for Claessen's version of the finger tree, a sequence-like
data structure with constant-time cons/uncons at either end. Finally, we discuss
the current limitations of LiquidHaskell that made certain versions of these
data structures infeasible.
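
To make the first case study concrete, the following listing is a minimal, hypothetical sketch of the cost accounting behind a stack with multipop, written as plain runtime bookkeeping rather than the paper's LiquidHaskell encoding. The potential function phi, the per-operation costs, and the helper amortized are illustrative names introduced here; in the paper such invariants are instead expressed as refinement annotations that LiquidHaskell checks.

module Main where

-- Physicist's method: the potential of a stack is its length.
phi :: [a] -> Int
phi = length

-- push has actual cost 1 and raises the potential by 1,
-- so its amortized cost is 1 + 1 = 2.
push :: a -> [a] -> (Int, [a])
push x s = (1, x : s)

-- multipop k removes up to k elements; the actual cost equals the number
-- of elements removed and the potential drops by the same amount,
-- so the amortized cost is 0.
multipop :: Int -> [a] -> (Int, [a])
multipop k s = (length popped, rest)
  where
    (popped, rest) = splitAt k s

-- Amortized cost of one operation: actual cost + Phi(after) - Phi(before).
amortized :: Int -> [a] -> [a] -> Int
amortized cost before after = cost + phi after - phi before

main :: IO ()
main = do
  let s0 = [] :: [Int]
      (c1, s1) = push 1 s0
      (c2, s2) = push 2 s1
      (c3, s3) = multipop 2 s2
  -- Prints 2, 2 and 0: every operation stays within a constant amortized bound.
  mapM_ print [amortized c1 s0 s1, amortized c2 s1 s2, amortized c3 s2 s3]

The point of the paper is that this bookkeeping need not happen at run time: the same potential argument can be stated as LiquidHaskell refinements so the amortized bound is verified at compile time, which is what the abstract means by proving amortized complexity directly in Haskell.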