Proceedings Workshop on Logical Frameworks and Meta-Languages: Theory and Practice
Florian Rabe, Claudio Sacerdoti Coen
DOI: 10.4204/EPTCS.404

Logical frameworks and meta-languages form a common substrate for representing, implementing, and reasoning about a wide variety of deductive systems of interest in logic and computer science. Their design, their implementation, and their use in reasoning tasks, ranging from the correctness of software to the properties of formal systems, have been the focus of considerable research over the last three decades. The LFMTP workshop brought together designers, implementors, and practitioners to discuss various aspects impinging on the structure and utility of logical frameworks, including the treatment of variable binding, inductive and co-inductive reasoning techniques, and the expressiveness and lucidity of the reasoning process. The 2024 edition of LFMTP was organized by Florian Rabe and Claudio Sacerdoti Coen in Tallinn, Estonia, on 8 July, as a satellite event of the FSCD conference. The workshop received 8 submissions, of which 6 were presented at the workshop. Of these, 2 were work-in-progress presentations and 4 were accepted for these formal proceedings. In addition, Carsten Schürmann of the IT University of Copenhagen gave an invited talk on Nominal State Separating Proofs.
The"Harmony Lemma", as formulated by Sangiorgi&Walker, establishes the equivalence between the labelled transition semantics and the reduction semantics in the $pi$-calculus. Despite being a widely known and accepted result for the standard $pi$-calculus, this assertion has never been rigorously proven, formally or informally. Hence, its validity may not be immediately apparent when considering extensions of the $pi$-calculus. Contributing to the second challenge of the Concurrent Calculi Formalization Benchmark -- a set of challenges tackling the main issues related to the mechanization of concurrent systems -- we present a formalization of this result for the fragment of the $pi$-calculus examined in the Benchmark. Our formalization is implemented in Beluga and draws inspiration from the HOAS formalization of the LTS semantics popularized by Honsell et al. In passing, we introduce a couple of useful encoding techniques for handling telescopes and lexicographic induction.
{"title":"A Beluga Formalization of the Harmony Lemma in the π-Calculus","authors":"Gabriele Cecilia, A. Momigliano","doi":"10.4204/EPTCS.404.1","DOIUrl":"https://doi.org/10.4204/EPTCS.404.1","url":null,"abstract":"The\"Harmony Lemma\", as formulated by Sangiorgi&Walker, establishes the equivalence between the labelled transition semantics and the reduction semantics in the $pi$-calculus. Despite being a widely known and accepted result for the standard $pi$-calculus, this assertion has never been rigorously proven, formally or informally. Hence, its validity may not be immediately apparent when considering extensions of the $pi$-calculus. Contributing to the second challenge of the Concurrent Calculi Formalization Benchmark -- a set of challenges tackling the main issues related to the mechanization of concurrent systems -- we present a formalization of this result for the fragment of the $pi$-calculus examined in the Benchmark. Our formalization is implemented in Beluga and draws inspiration from the HOAS formalization of the LTS semantics popularized by Honsell et al. In passing, we introduce a couple of useful encoding techniques for handling telescopes and lexicographic induction.","PeriodicalId":30085,"journal":{"name":"Electronic Proceedings in Theoretical Computer Science","volume":" 373","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141669334","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Proofs for Free in the λΠ-Calculus Modulo Theory
Thomas Traversié
DOI: 10.4204/EPTCS.404.4

Parametricity allows the transfer of proofs between different implementations of the same data structure. The λΠ-calculus modulo theory is an extension of the λ-calculus with dependent types and user-defined rewrite rules; it is a logical framework used to exchange proofs between different proof systems. We define an interpretation of theories of the λΠ-calculus modulo theory, inspired by parametricity. Such an interpretation allows proofs to be transferred for free between theories that feature the notions of proposition and proof, whenever the source theory can be embedded into the target theory.
Binding Contexts as Partitionable Multisets in Abella
Terrance Gray, Gopalan Nadathur
DOI: 10.4204/EPTCS.404.2

When reasoning about formal objects whose structures involve binding, it is often necessary to analyze expressions relative to a context that associates types, values, and other related attributes with variables that appear free in the expressions. We refer to such associations as binding contexts. Reasoning tasks also require properties of binding contexts, such as the shape and uniqueness of the associations they contain, to be made explicit. The Abella proof assistant, which supports a higher-order treatment of syntactic constructs, provides a simple and elegant way to describe such contexts, from which their properties can be extracted. This mechanism is based at the outset on viewing binding contexts as ordered sequences of associations. However, when dealing with object systems that embody notions of linearity, it becomes necessary to treat binding contexts more generally as partitionable multisets. We show how to adapt the original Abella encoding to encompass such a generalization. The key idea in this adaptation is to base the definition of a binding context on a mapping to an underlying ordered sequence of associations. We further show that properties that hold with the ordered-sequence view can be lifted to the generalized definition of binding contexts, and that this lifting can, in fact, be automated. These ideas find use in the extension, currently under development, of the two-level logic approach of Abella to a setting where linear logic is used as the specification logic.
Kuroda's Translation for the λΠ-Calculus Modulo Theory and Dedukti
Thomas Traversié
DOI: 10.4204/EPTCS.404.3

Kuroda's translation embeds classical first-order logic into intuitionistic logic through the insertion of double negations. Recently, Brown and Rizkallah extended this translation to higher-order logic. In this paper, we adapt it to theories encoded in higher-order logic in the λΠ-calculus modulo theory, a logical framework that extends the λ-calculus with dependent types and user-defined rewrite rules. We develop a tool that implements Kuroda's translation for proofs written in Dedukti, a proof language based on the λΠ-calculus modulo theory.
Formalizing Factorization on Euclidean Domains and Abstract Euclidean Algorithms
Thaynara Arielly de Lima, Andréia B. Avelar, André Luiz Galdino, Mauricio Ayala-Rincón
DOI: 10.4204/EPTCS.402.5

This paper discusses the extension of the Prototype Verification System (PVS) sub-theory for rings, part of the PVS algebra theory, with theorems related to the division algorithm for Euclidean rings and to unique factorization domains, the general structures in which an analogue of the Fundamental Theorem of Arithmetic holds. First, we formalize the general abstract notions of divisibility, prime, and irreducible elements in commutative rings, which are essential for dealing with unique factorization domains. Then, we formalize the landmark theorem establishing that every principal ideal domain is a unique factorization domain. Finally, we specify the theory of Euclidean domains and formally verify that the ring of integers, the ring of Gaussian integers, and arbitrary fields are Euclidean domains. To highlight the benefits of such a general abstract discipline of formalization, we specify a Euclidean gcd algorithm for Euclidean domains and formalize its correctness. We also show how this correctness is inherited under adequate parameterizations for the structures of integers and Gaussian integers.
{"title":"Proof Terms for Higher-Order Rewriting and Their Equivalence","authors":"Pablo Barenbaum","doi":"10.4204/eptcs.402.1","DOIUrl":"https://doi.org/10.4204/eptcs.402.1","url":null,"abstract":"","PeriodicalId":30085,"journal":{"name":"Electronic Proceedings in Theoretical Computer Science","volume":"56 2","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140675894","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Bottoms Up for CHCs: Novel Transformation of Linear Constrained Horn Clauses to Software Verification
Márk Somorjai, Mihály Dobos-Kovács, Zsófia Ádám, Levente Bajczi, András Vörös
DOI: 10.4204/EPTCS.402.11

Constrained Horn Clauses (CHCs) have conventionally been used as a low-level representation in formal verification. Most existing solvers use a diverse set of specialized techniques, including direct state-space traversal and under-approximating abstraction, necessitating purpose-built, complex algorithms. Other solvers have successfully simplified the verification workflow by translating the problem into the input of other verification tasks, leveraging the strengths of existing algorithms. One such approach transforms the CHC problem into a recursive program that roughly emulates a top-down solver for the deduction task, and verifies the reachability of a safety violation specified as a control location. We propose an alternative bottom-up approach for linear CHCs and evaluate the two options in the open-source model checking framework THETA on both synthetic and industrial examples. We find a more than twofold increase in the number of solved tasks when the novel bottom-up approach is used in the verification workflow, compared with the top-down technique.
Semi-Substructural Logics with Additives
Niccolò Veltri, Cheng-Syuan Wan
DOI: 10.4204/EPTCS.402.8

This work concerns the proof theory of (left) skew monoidal categories and their variants (e.g. closed monoidal, symmetric monoidal), continuing the line of work initiated in recent years by Uustalu et al. Skew monoidal categories are a weak version of Mac Lane's monoidal categories, in which the structural laws are not required to be invertible: they are merely natural transformations with a specific orientation. Sequent calculi that can be modelled in such categories can be identified as deductive systems for restricted substructural fragments of intuitionistic linear logic. These calculi enjoy cut elimination and admit a focusing strategy, resembling Andreoli's normalization technique for linear logic. The focusing procedure is useful for solving the coherence problem of the considered categories with skew structure. Here we investigate possible extensions of the sequent calculi of Uustalu et al. with additive connectives. As a first step, we extend the sequent calculus with additive conjunction and disjunction, which corresponds to studying the proof theory of skew monoidal categories with binary products and coproducts satisfying a left-distributivity condition. We introduce a new focused sequent calculus of derivations in normal form, which employs tag annotations to reduce non-deterministic choices in bottom-up proof search. The focused sequent calculus and the proof of its correctness have been formalized in the Agda proof assistant. We also discuss extensions of the logic with additive units, a form of skew exchange, and linear implication.
Embedding Differential Dynamic Logic in PVS
J. Slagel, Mariano M. Moscato, Lauren White, César Muñoz, Swee Balachandran, Aaron Dutle
DOI: 10.4204/EPTCS.402.7

Differential dynamic logic (dL) is a formal framework for specifying and reasoning about hybrid systems, i.e., dynamical systems that exhibit both continuous and discrete behaviors. Such systems arise in many safety- and mission-critical applications. This paper presents a formalization of dL in the Prototype Verification System (PVS) that includes the semantics of hybrid programs and dL's proof calculus. The formalization embeds dL into the PVS logic, resulting in a version of dL whose proof calculus is not only formally verified but also available for the verification of hybrid programs within PVS itself. This embedding, called Plaidypvs (Properly Assured Implementation of dL for Hybrid Program Verification and Specification), supports standard dL-style proofs, and further leverages the capabilities of PVS to allow reasoning about entire classes of hybrid programs. The embedding also allows the user to import the well-established definitions and mathematical theories available in PVS.