{"title":"Editorial DEON 2020/2021 Special Issue","authors":"","doi":"10.1093/logcom/exad073","DOIUrl":"https://doi.org/10.1093/logcom/exad073","url":null,"abstract":"","PeriodicalId":50162,"journal":{"name":"Journal of Logic and Computation","volume":"35 9","pages":""},"PeriodicalIF":0.7,"publicationDate":"2023-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139007577","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hamilton’s cumular conception of quantifying particles: an exercise in third-order logic","authors":"David Makinson","doi":"10.1093/logcom/exad072","DOIUrl":"https://doi.org/10.1093/logcom/exad072","url":null,"abstract":"Sir William Hamilton is remembered for his proposal to extend the four traditional categoricals to eight by quantifying predicate as well as subject terms. He intended the quantifying particles to be understood in a ‘collective’ or ‘cumular’ manner rather than in a ‘distributive’ or ‘exemplar’ one, but commentators from De Morgan onwards have worked primarily from the latter perspective, comforted in the 20th century by the fact that it translates readily into the language of first-order logic with identity. Formal representation of the cumular approach needs more sophisticated resources, and the paper shows how it may be carried out using selection functions in the language of third-order logic. It also reviews a number of variants, some equivalent and others not so, as well as their reductions to second-order logic, and situates historical sources, both before and after Hamilton, with respect to the web of formal constructions.","PeriodicalId":50162,"journal":{"name":"Journal of Logic and Computation","volume":" 20","pages":""},"PeriodicalIF":0.7,"publicationDate":"2023-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138492997","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Defining Logical Systems via Algebraic Constraints on Proofs","authors":"Alexander V Gheorghiu, David J Pym","doi":"10.1093/logcom/exad065","DOIUrl":"https://doi.org/10.1093/logcom/exad065","url":null,"abstract":"We present a comprehensive programme analysing the decomposition of proof systems for non-classical logics into proof systems for other logics, especially classical logic, using an algebra of constraints. That is, one recovers a proof system for a target logic by enriching a proof system for another, typically simpler, logic with an algebra of constraints that act as correctness conditions on the latter to capture the former; e.g. one may use Boolean algebra to give constraints in a sequent calculus for classical propositional logic to produce a sequent calculus for intuitionistic propositional logic. The idea behind such forms of decomposition is to obtain a tool for uniform and modular treatment of proof theory and to provide a bridge between the semantics of logics and their proof theory. The paper discusses the theoretical background of the project and provides several illustrations of its work in the field of intuitionistic and modal logics, including a uniform treatment of modular and cut-free proof systems for a large class of propositional logics; a general criterion for a novel approach to soundness and completeness of a logic with respect to a model-theoretic semantics; and a case study deriving a model-theoretic semantics from a proof-theoretic specification of a logic.","PeriodicalId":50162,"journal":{"name":"Journal of Logic and Computation","volume":"9 1","pages":""},"PeriodicalIF":0.7,"publicationDate":"2023-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138517193","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Computational paths - a weak groupoid","authors":"Tiago M L de Veras, Arthur F Ramos, Ruy J G B de Queiroz, Anjolina G de Oliveira","doi":"10.1093/logcom/exad071","DOIUrl":"https://doi.org/10.1093/logcom/exad071","url":null,"abstract":"On the basis of a labelled deduction system (LND$_{EQ}$-TRS), we demonstrate how to formalize the concept of computational paths (sequences of rewrites) as equalities between two terms of the same type. This has allowed us to carry out a formal counterpart to equality between paths, which is dealt with in homotopy theory, but this time with an approach using the device of term-rewriting paths. Using such a formal calculus dealing with paths, we construct the fundamental groupoid of a path-connected type $X$ and we define the concept of isomorphism between types. Next, we show that the computational paths determine a weak category, which will be called $\\mathcal{C}_{paths}$. Finally, we show that the weak category $\\mathcal{C}_{paths}$ determines a weak groupoid.","PeriodicalId":50162,"journal":{"name":"Journal of Logic and Computation","volume":"1987 6","pages":""},"PeriodicalIF":0.7,"publicationDate":"2023-11-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138496208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Applying Theory to Practice","authors":"Ronald Fagin","doi":"10.1093/logcom/exad066","DOIUrl":"https://doi.org/10.1093/logcom/exad066","url":null,"abstract":"By making use of three IBM case studies involving the author and colleagues, this paper is about applying theory to practice. In the first case study, the system builders (or practitioners) initiated the interaction. This interaction led to the following problem. Assume that there is a set of objects, each with multiple attributes, and there is a numerical score assigned to each attribute of each object. In the spirit of real-valued logics, there is a scoring function (such as the min or the average), and a ranking of the objects is obtained by applying the scoring function to the scores of each object’s attributes. The problem is to find the top $k$ objects, while minimizing the number of database accesses. An algorithm is given that is optimal in an extremely strong sense: not just in the worst case or the average case, but (up to a constant factor) in every case! Even though the algorithm is only 8 lines long (!), the paper containing the algorithm won the 2014 Gödel Prize, the top prize for a paper in theoretical computer science. The interaction in the second case study was initiated by theoreticians, who wanted to lay the foundations for ‘data exchange’, in which data is converted from one format to another. Although this problem may sound mundane, the issues that arise are fascinating, and this work made data exchange a new subfield, with special sessions in every major database conference. This work won the 2020 Alonzo Church Award, the highest prize for research in logic and computation. The third case study, specifically on real-valued (or ‘fuzzy’) logic, arose as part of a large ‘Logical Neural Nets’ (LNN) project at IBM. The inputs to, say, an ‘and’ gate could each be any numbers in the interval [0,1]. The system builders of LNN wanted a sound and complete axiomatization for real-valued logic, so that they could arrive at truth values given other truth values whenever possible. This recent work provides a sound and complete axiomatization for a large class of real-valued logics, including the most common ones. It also allows weights, where the importance of some subformulas can be greater than that of other subformulas. This paper is aimed at both theoreticians and system builders, to show them the mutual benefits of working together. This is via the three case studies mentioned above: two initiated by the system builders, and one by the theoreticians. The moral for the theoreticians is to show by example how to apply theory to practice, and why applying theory to practice can lead to better theory. The moral for the system builders is the value of theory, and the value of involving theoreticians. This paper is written in a very informal style. In fact, it is based closely on a talk on ‘Applying theory to practice’ that the author has presented a number of times.","PeriodicalId":50162,"journal":{"name":"Journal of Logic and Computation","volume":"1987 7","pages":""},"PeriodicalIF":0.7,"publicationDate":"2023-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138496207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
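The top-$k$ problem in Fagin's first case study is the one solved by the Threshold Algorithm (TA). Below is a minimal in-memory sketch of that idea, not the paper's own 8-line pseudocode; the function and parameter names are illustrative, and it assumes every object appears in every attribute list and that the scoring function is monotone (e.g. min or average).

```python
def threshold_algorithm(lists, f, k):
    """Sketch of the Threshold Algorithm for top-k retrieval.

    lists: one list per attribute, each of (object, score) pairs
           sorted by score in descending order.
    f:     monotone scoring function taking a list of scores,
           e.g. min, or a lambda computing the average.
    k:     number of top-scoring objects to return.
    """
    # Random-access index: per attribute, object -> score.
    # (Assumes every object occurs in every list.)
    index = [dict(lst) for lst in lists]
    top = {}   # object -> overall score, for objects seen so far
    depth = 0
    while depth < max(len(lst) for lst in lists):
        last_seen = []
        for lst in lists:
            obj, score = lst[min(depth, len(lst) - 1)]  # sorted access
            last_seen.append(score)
            if obj not in top:
                # Random access: fetch this object's score in every list.
                top[obj] = f([index[j][obj] for j in range(len(lists))])
        depth += 1
        # Threshold: the best possible score of any object not yet seen.
        threshold = f(last_seen)
        best_k = sorted(top.values(), reverse=True)[:k]
        if len(best_k) == k and best_k[-1] >= threshold:
            break  # no unseen object can enter the top k
    return sorted(top.items(), key=lambda kv: -kv[1])[:k]
```

The stopping rule is what gives TA its strong optimality: once $k$ seen objects score at least the threshold, no further sorted or random accesses are needed.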
{"title":"Correction to: Minimal degrees and downwards density in some strong positive reducibilities and quasi-reducibilities","authors":"","doi":"10.1093/logcom/exad069","DOIUrl":"https://doi.org/10.1093/logcom/exad069","url":null,"abstract":"","PeriodicalId":50162,"journal":{"name":"Journal of Logic and Computation","volume":"29 4","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134909474","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Conjunctive degrees and cylinders","authors":"Irakli Chitaia, Roland Omanadze, Andrea Sorbi","doi":"10.1093/logcom/exad064","DOIUrl":"https://doi.org/10.1093/logcom/exad064","url":null,"abstract":"Abstract In this article, we define and study the notion of a $(c,c_{1})$-cylinder, which turns out to be a very useful instrument for investigating the relationships between conjunctive reducibility ($c$-reducibility) and its injective version $c_{1}$-reducibility. Using this notion, we prove the following results: (i) neither hypersimple sets nor hemimaximal sets can be $(c,c_{1})$-cylinders; (ii) the $c$-degree of a noncomputable c.e. set contains either only one or infinitely many noncomputable $c_{1}$-degrees; (iii) the $c$-degree of either a hemimaximal set or a hypersimple set contains infinitely many noncomputable $c_{1}$-degrees.","PeriodicalId":50162,"journal":{"name":"Journal of Logic and Computation","volume":"72 2","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135167797","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Context-based argumentation frameworks and multi-agent consensus building","authors":"Zhe Yu, Shier Ju, Weiwei Chen","doi":"10.1093/logcom/exad063","DOIUrl":"https://doi.org/10.1093/logcom/exad063","url":null,"abstract":"Abstract Argumentation must be conducted within specific contexts that involve particular social norms and values. For decision-making, the divergence of opinions among participants does not lie solely in disagreements of common sense and beliefs but mainly stems from differences in the priority orderings over values. In this paper, we discuss how to build consensus among participants holding different value orderings based on an extended structured argumentation framework that takes contextual factors into account. Compared with other formal systems for multi-agent reasoning based on argumentation, the context-based argumentation system especially emphasizes the dynamic nature of contexts. In addition, we refer to a pragmatic perspective to discuss how people manage to achieve the consensus they expect by changing contexts during an argument.","PeriodicalId":50162,"journal":{"name":"Journal of Logic and Computation","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135780015","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"PubHubs identity management","authors":"Bart Jacobs, Bram Westerbaan, Omar Javed, Harm van Stekelenburg, Lian Vervoort, Jan den Besten","doi":"10.1093/logcom/exad062","DOIUrl":"https://doi.org/10.1093/logcom/exad062","url":null,"abstract":"Abstract Finding a combination between privacy and accountability in the online world is a challenge. Too little accountability supports problematic behaviour. Too little privacy undermines individual freedom and has a chilling effect. This paper describes the identity infrastructure of a new open source community platform called PubHubs. It combines local group conversations, via its own adaptation of Matrix, with proportional authentication of users, in local identity spaces. PubHubs thus achieves a combination of privacy and accountability. The technical core of the paper describes the cryptographic protocols for managing digital identities, via both personal attributes and local pseudonyms. Around this core, the roles of digital identities within the PubHubs platform are explained in functional terms, giving participants for instance more certainty about others in a conversation, or giving moderators new tools.","PeriodicalId":50162,"journal":{"name":"Journal of Logic and Computation","volume":"97 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136181880","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Polyhedral semantics and the tractable approximation of Łukasiewicz infinitely-valued logic","authors":"Marcelo Finger, Sandro Preto","doi":"10.1093/logcom/exad059","DOIUrl":"https://doi.org/10.1093/logcom/exad059","url":null,"abstract":"Abstract In this work, we present polyhedral semantics as a means to tractably approximate Łukasiewicz infinitely-valued logic (Ł$_{\\infty}$). As Ł$_{\\infty}$ is an expressive multivalued propositional logic whose decision problem is NP-complete, we show how to obtain an approximation for this problem by providing a family of multivalued logics over the same language as Ł$_{\\infty}$. Each element of the family is associated to a polynomial-time linear program, thus providing a tractable way of deciding each intermediate step. We also investigate properties of the logic system derived from polyhedral semantics and the details of an algorithm for the approximation process.","PeriodicalId":50162,"journal":{"name":"Journal of Logic and Computation","volume":"107 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135855847","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
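For reference alongside the Finger-Preto abstract: the standard truth functions of infinitely-valued Łukasiewicz logic over the interval [0,1] are as below. This is a sketch of the textbook semantics only, not of the paper's polyhedral approximation; the function names are illustrative.

```python
# Standard Łukasiewicz truth functions on [0, 1].

def lneg(x):
    """Negation: ¬x = 1 - x."""
    return 1.0 - x

def lconj(x, y):
    """Strong conjunction: x ⊗ y = max(0, x + y - 1)."""
    return max(0.0, x + y - 1.0)

def ldisj(x, y):
    """Strong disjunction: x ⊕ y = min(1, x + y)."""
    return min(1.0, x + y)

def limpl(x, y):
    """Implication: x → y = min(1, 1 - x + y)."""
    return min(1.0, 1.0 - x + y)
```

A formula is valid in Ł-infinity when it evaluates to 1 under every assignment of values in [0,1]; deciding this is the NP-complete problem the paper's family of logics approximates.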