
International Conference on Formal Structures for Computation and Deduction: Latest Publications

Cyclic Proofs for Arithmetical Inductive Definitions
Authors: Anupam Das, Lukas Melgaard
Pub Date : 2023-06-14 DOI: 10.4230/LIPIcs.FSCD.2023.27
We investigate the cyclic proof theory of extensions of Peano Arithmetic by (finitely iterated) inductive definitions. Such theories are essential to proof-theoretic analyses of certain 'impredicative' theories; moreover, our cyclic systems naturally subsume Simpson's Cyclic Arithmetic. Our main result is that cyclic and inductive systems for arithmetical inductive definitions are equally powerful. We conduct a metamathematical argument, formalising the soundness of cyclic proofs within second-order arithmetic by a form of induction on closure ordinals, thence appealing to conservativity results. This approach is inspired by those of Simpson and Das for Cyclic Arithmetic; however, we must further address a difficulty: the closure ordinals of our inductive definitions (around Church-Kleene) far exceed the proof-theoretic ordinal of the appropriate metatheory (around Bachmann-Howard), so explicit induction on their notations is not possible. For this reason, we rather rely on a formalisation of the theory of (recursive) ordinals within second-order arithmetic.
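For readers unfamiliar with the closure ordinals mentioned above, the standard stage-by-stage reading of an arithmetical inductive definition runs as follows (textbook background, not the paper's own formalisation). A formula \(\varphi(X,x)\) in which the set variable \(X\) occurs only positively induces a monotone operator and its stages:

\[
\Gamma_\varphi(S) = \{\, n \in \mathbb{N} \mid \varphi(S,n) \,\}, \qquad
\Gamma_\varphi^{\alpha} = \Gamma_\varphi\Bigl(\bigcup_{\beta<\alpha}\Gamma_\varphi^{\beta}\Bigr), \qquad
I_\varphi = \bigcup_{\alpha}\Gamma_\varphi^{\alpha}.
\]

The closure ordinal of \(\Gamma_\varphi\) is the least \(\alpha\) with \(\Gamma_\varphi^{\alpha} = I_\varphi\); for positive arithmetical operators it can reach \(\omega_1^{\mathrm{CK}}\), which is why the abstract places these ordinals "around Church-Kleene", far above the Bachmann-Howard ordinal that measures the metatheory.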
Citations: 0
The Logical Essence of Compiling With Continuations
Authors: J. E. Santo, Filipa Mendes
Pub Date : 2023-04-28 DOI: 10.48550/arXiv.2304.14752
The essence of compiling with continuations is that conversion to continuation-passing style (CPS) is equivalent to a source language transformation converting to administrative normal form (ANF). Taking as source language Moggi's computational lambda-calculus (lbc), we define an alternative to the CPS-translation with target in the sequent calculus LJQ, named value-filling style (VFS) translation, and making use of the ability of the sequent calculus to represent contexts formally. The VFS-translation requires no type translation: indeed, double negations are introduced only when encoding the VFS target language in the CPS target language. This optional encoding, when composed with the VFS-translation, reconstructs the original CPS-translation. Going back to direct style, the "essence" of the VFS-translation is that it reveals a new sublanguage of ANF, the value-enclosed style (VES), next to another one, the continuation-enclosing style (CES): such an alternative is due to a dilemma in the syntax of lbc, concerning how to expand the application constructor. In the typed scenario, VES and CES correspond to an alternative between two proof systems for call-by-value, LJQ and natural deduction with generalized applications, confirming proof theory as a foundation for intermediate representations.
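To make the CPS/ANF contrast concrete, here is a minimal Haskell sketch (illustrative only: it is not the paper's VFS translation, and the names direct, cps, anf, fCps, gCps are invented for the example). The same computation is written in direct style, in continuation-passing style, and in A-normal form, and all three agree.

    -- Direct style: compute g (f x) + 1.
    direct :: Int -> Int
    direct x = g (f x) + 1
      where
        f n = n * 2
        g n = n + 3

    -- CPS: every intermediate result is handed to a continuation.
    cps :: Int -> (Int -> r) -> r
    cps x k = fCps x (\v1 -> gCps v1 (\v2 -> k (v2 + 1)))
      where
        fCps n c = c (n * 2)   -- f, continuation-passing
        gCps n c = c (n + 3)   -- g, continuation-passing

    -- ANF: every intermediate result is named by a let,
    -- but control stays in direct style (no continuations).
    anf :: Int -> Int
    anf x =
      let v1 = f x
          v2 = g v1
      in v2 + 1
      where
        f n = n * 2
        g n = n + 3

    main :: IO ()
    main = print (direct 5, cps 5 id, anf 5)   -- prints (14,14,14)

Both cps and anf name every intermediate value; the equivalence recalled in the abstract says, roughly, that the extra continuation plumbing of CPS adds nothing essential over ANF.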
Citations: 0
Compositional Confluence Criteria
Authors: Kiraku Shintani, Nao Hirokawa
Pub Date : 2023-03-07 DOI: 10.4230/LIPIcs.FSCD.2022.28
We show how confluence criteria based on decreasing diagrams are generalized to ones composable with other criteria. For demonstration of the method, the confluence criteria of orthogonality, rule labeling, and critical pair systems for term rewriting are recast into composable forms. We also show how such a criterion can be used for a reduction method that removes rewrite rules unnecessary for confluence analysis. In addition, we prove that Toyama's parallel closedness result based on parallel critical pairs subsumes his almost parallel closedness theorem.
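For readers new to critical pairs, a standard textbook-style example (not taken from the paper) of an overlap that breaks confluence:

\[
\rho_1: f(g(x)) \to x, \qquad \rho_2: g(a) \to b.
\]

The left-hand sides overlap in \(f(g(a))\), which rewrites to \(a\) by \(\rho_1\) and to \(f(b)\) by \(\rho_2\). The resulting critical pair \(\langle a,\, f(b)\rangle\) consists of two distinct normal forms, so this system is not even locally confluent; criteria such as orthogonality or joinability of critical pairs exist precisely to rule out such non-joinable overlaps.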
Citations: 1
An Analysis of Tennenbaum's Theorem in Constructive Type Theory
Authors: Marc Hermes, Dominik Kirst
Pub Date : 2023-02-28 DOI: 10.4230/LIPIcs.FSCD.2022.9
Tennenbaum's theorem states that the only countable model of Peano arithmetic (PA) with computable arithmetical operations is the standard model of natural numbers. In this paper, we use constructive type theory as a framework to revisit, analyze and generalize this result. The chosen framework allows for a synthetic approach to computability theory, exploiting that, externally, all functions definable in constructive type theory can be shown computable. We then build on this viewpoint and furthermore internalize it by assuming a version of Church's thesis, which expresses that any function on natural numbers is representable by a formula in PA. This assumption provides for a conveniently abstract setup to carry out rigorous computability arguments, even in the theorem's mechanization. Concretely, we constructivize several classical proofs and present one inherently constructive rendering of Tennenbaum's theorem, all following arguments from the literature. Concerning the classical proofs in particular, the constructive setting allows us to highlight differences in their assumptions and conclusions which are not visible classically. All versions are accompanied by a unified mechanization in the Coq proof assistant.
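For orientation, the two ingredients in their standard textbook formulations (the paper's constructive versions are more refined than this): the assumed form of Church's thesis says that every function on the naturals is representable by a PA formula,

\[
\forall f : \mathbb{N}\to\mathbb{N}.\ \exists \varphi(x,y).\ \forall n.\ \ \mathrm{PA} \vdash \forall y\,\bigl(\varphi(\overline{n},y) \leftrightarrow y = \overline{f(n)}\bigr),
\]

while Tennenbaum's theorem states that a countable model \(\mathcal{M} \models \mathrm{PA}\) whose addition and multiplication are computable is isomorphic to the standard model \(\mathbb{N}\).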
Citations: 4
Rewriting modulo traced comonoid structure
Authors: D. Ghica, G. Kaye
Pub Date : 2023-02-19 DOI: 10.48550/arXiv.2302.09631
In this paper we adapt previous work on rewriting string diagrams using hypergraphs to the case where the underlying category has a traced comonoid structure, in which wires can be forked and the outputs of a morphism can be connected to its input. Such a structure is particularly interesting because any traced Cartesian (dataflow) category has an underlying traced comonoid structure. We show that certain subclasses of hypergraphs are fully complete for traced comonoid categories: that is to say, every term in such a category has a unique corresponding hypergraph up to isomorphism, and from every hypergraph with the desired properties, a unique term in the category can be retrieved up to the axioms of traced comonoid categories. We also show how the framework of double pushout rewriting (DPO) can be adapted for traced comonoid categories by characterising the valid pushout complements for rewriting in our setting. We conclude by presenting a case study in the form of recent work on an equational theory for sequential circuits: circuits built from primitive logic gates with delay and feedback. The graph rewriting framework allows for the definition of an operational semantics for sequential circuits.
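As a reminder of the structure involved (standard definitions, not specific to this paper): a cocommutative comonoid on an object \(A\) of a symmetric monoidal category is a copy map \(\Delta_A : A \to A \otimes A\) and a discard map \(\varepsilon_A : A \to I\) satisfying, up to the structural isomorphisms,

\[
(\Delta_A \otimes \mathrm{id}_A)\circ\Delta_A = (\mathrm{id}_A \otimes \Delta_A)\circ\Delta_A, \qquad
(\varepsilon_A \otimes \mathrm{id}_A)\circ\Delta_A = \mathrm{id}_A, \qquad
\sigma_{A,A}\circ\Delta_A = \Delta_A,
\]

and a trace sends \(f : A \otimes X \to B \otimes X\) to \(\mathrm{Tr}^{X}_{A,B}(f) : A \to B\). The copy map models the forking of wires and the trace models the feedback loops of the sequential-circuit case study.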
Citations: 2
E-unification for Second-Order Abstract Syntax
Authors: Nikolai Kudasov
Pub Date : 2023-02-11 DOI: 10.48550/arXiv.2302.05815
Higher-order unification (HOU) concerns unification of (extensions of) $\lambda$-calculus and can be seen as an instance of equational unification ($E$-unification) modulo $\beta\eta$-equivalence of $\lambda$-terms. We study equational unification of terms in languages with arbitrary variable binding constructions modulo arbitrary second-order equational theories. Abstract syntax with general variable binding and parametrised metavariables allows us to work with arbitrary binders without committing to $\lambda$-calculus or using inconvenient and error-prone term encodings, leading to a more flexible framework. In this paper, we introduce $E$-unification for second-order abstract syntax and describe a unification procedure for such problems, merging ideas from both full HOU and general $E$-unification. We prove that the procedure is sound and complete.
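A small first-order example of the gap between syntactic unification and $E$-unification (the paper's setting additionally has binders and parametrised metavariables): modulo the commutativity axiom \(x + y \approx y + x\), the problem

\[
X + a \;\stackrel{?}{\approx}_{C}\; a + b
\]

has the unifier \(\{X \mapsto b\}\), since \(b + a \approx_C a + b\); with the empty theory the same problem is unsolvable, because positional matching of the arguments forces \(X \mapsto a\) together with the impossible constraint \(a \approx b\).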
Citations: 0
On the Lattice of Program Metrics
Authors: Ugo Dal Lago, Naohiko Hoshino, Paolo Pistone
Pub Date : 2023-02-10 DOI: 10.48550/arXiv.2302.05022
In this paper we are concerned with understanding the nature of program metrics for calculi with higher-order types, seen as natural generalizations of program equivalences. Some of the metrics we are interested in are well-known, such as those based on the interpretation of terms in metric spaces and those obtained by generalizing observational equivalence. We also introduce a new one, called the interactive metric, built by applying the well-known Int-Construction to the category of metric complete partial orders. Our aim is then to understand how these metrics relate to each other, i.e., whether and in which cases one such metric refines another, in analogy with corresponding well-studied problems about program equivalences. The results we obtain are twofold. We first show that the metrics of semantic origin, i.e., the denotational and interactive ones, lie in between the observational and equational metrics and that in some cases, these inclusions are strict. Then, we give a result about the relationship between the denotational and interactive metrics, revealing that the former is less discriminating than the latter. All our results are given for a linear lambda-calculus, and some of them can be generalized to calculi with graded comonads, in the style of Fuzz.
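As background on metric interpretations of higher-order types (a standard construction; the paper's metrics refine it in several directions): if the types \(A\) and \(B\) denote metric spaces, the function type is commonly interpreted as the nonexpansive maps equipped with the sup metric

\[
d_{A \to B}(f,g) \;=\; \sup_{x \in A} d_B\bigl(f(x), g(x)\bigr),
\]

so two programs are close when they are close on every input. The observational metric of the abstract is instead induced by the observations that program contexts can make, generalizing observational equivalence.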
Citations: 1
For the Metatheory of Type Theory, Internal Sconing Is Enough
Authors: Rafael Bocquet, A. Kaposi, Christian Sattler
Pub Date : 2023-02-10 DOI: 10.48550/arXiv.2302.05190
Metatheorems about type theories are often proven by interpreting the syntax into models constructed using categorical gluing. We propose to use only sconing (gluing along a global section functor) instead of general gluing. The sconing is performed internally to a presheaf category, and we recover the original glued model by externalization. Our method relies on constructions involving two notions of models: first-order models (with explicit contexts) and higher-order models (without explicit contexts). Sconing turns a displayed higher-order model into a displayed first-order model. Using these, we derive specialized induction principles for the syntax of type theory. The input of such an induction principle is a boilerplate-free description of its motives and methods, not mentioning contexts. The output is a section with computation rules specified in the same internal language. We illustrate our framework by proofs of canonicity, normalization and syntactic parametricity for type theory.
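For reference, the scone in its simplest external form (the paper performs the construction internally to a presheaf category): given a category \(\mathcal{C}\) with a terminal object \(1\), let \(\Gamma = \mathcal{C}(1,-) : \mathcal{C} \to \mathbf{Set}\) be the global-sections functor; the scone of \(\mathcal{C}\) is the comma category \(\mathbf{Set} \downarrow \Gamma\), whose objects are triples \((S, A, \phi : S \to \Gamma A)\), i.e. proof-relevant predicates on the global elements of \(A\), and whose morphisms \((S,A,\phi) \to (S',A',\phi')\) are pairs \((u : S \to S',\, f : A \to A')\) with \(\phi' \circ u = \Gamma f \circ \phi\). Interpreting the syntax in such a glued category is what drives logical-relations arguments like the canonicity and normalization proofs mentioned above.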
Citations: 3
Strategies as Resource Terms, and their Categorical Semantics
Authors: Lison Blondeau-Patissier, P. Clairambault, Lionel Vaux Auclair
Pub Date : 2023-02-09 DOI: 10.48550/arXiv.2302.04685
As shown by Tsukada and Ong, normal (extensional) simply-typed resource terms correspond to plays in Hyland-Ong games, quotiented by Melliès' homotopy equivalence. Though inspiring, their proof is indirect, relying on the injectivity of the relational model w.r.t. both sides of the correspondence - in particular, the dynamics of the resource calculus is taken into account only via the compatibility of the relational model with the composition of normal terms defined by normalization. In the present paper, we revisit and extend these results. Our first contribution is to restate the correspondence by considering causal structures we call augmentations, which are canonical representatives of Hyland-Ong plays up to homotopy. This allows us to give a direct and explicit account of the connection with normal resource terms. As a second contribution, we extend this account to the reduction of resource terms: building on a notion of strategies as weighted sums of augmentations, we provide a denotational model of the resource calculus, invariant under reduction. A key step - and our third contribution - is a categorical model we call a resource category, which is to the resource calculus what differential categories are to the differential λ-calculus.
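As a reminder of what resource terms are (standard resource-calculus conventions; the paper's augmentation-based presentation refines this): an application carries a finite multiset, or bag, of arguments, each of which must be consumed exactly once, so that reduction produces a formal sum,

\[
(\lambda x.\, t)\,[u_1, \dots, u_n] \;\to\; \sum_{\sigma} t\bigl[u_{\sigma(1)}/x^{(1)}, \dots, u_{\sigma(n)}/x^{(n)}\bigr],
\]

where the sum ranges over the bijections \(\sigma\) between the bag elements and the occurrences \(x^{(1)},\dots\) of \(x\) in \(t\), and the result is \(0\) when the number of occurrences differs from \(n\). For instance \((\lambda x.\,x)[u] \to u\), while \((\lambda x.\,x)[u,v] \to 0\).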
Citations: 0
The Formal Theory of Monads, Univalently
Authors: N. V. D. Weide
Pub Date : 2022-12-16 DOI: 10.48550/arXiv.2212.08515
We develop the formal theory of monads, as established by Street, in univalent foundations. This allows us to formally reason about various kinds of monads on the right level of abstraction. In particular, we define the bicategory of monads internal to a bicategory, and prove that it is univalent. We also define Eilenberg-Moore objects, and we show that both Eilenberg-Moore categories and Kleisli categories give rise to Eilenberg-Moore objects. Finally, we relate monads and adjunctions in arbitrary bicategories. Our work is formalized in Coq using the UniMath library.
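For reference, the data being formalised (the standard definition from the formal theory of monads): a monad in a bicategory \(\mathcal{B}\) is an object \(a\), a 1-cell \(t : a \to a\), and 2-cells \(\eta : \mathrm{id}_a \Rightarrow t\) and \(\mu : t \circ t \Rightarrow t\) satisfying, up to the structural constraints of \(\mathcal{B}\),

\[
\mu \circ (t \star \mu) = \mu \circ (\mu \star t), \qquad
\mu \circ (t \star \eta) = \mathrm{id}_t = \mu \circ (\eta \star t),
\]

where \(\star\) is whiskering; for \(\mathcal{B} = \mathbf{Cat}\) these are the familiar laws \(\mu \circ T\mu = \mu \circ \mu T\) and \(\mu \circ T\eta = \mathrm{id}_T = \mu \circ \eta T\). An Eilenberg-Moore object is then a universal object of algebras for such a monad inside \(\mathcal{B}\).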
Citations: 0