Discovering how program components affect one another plays a fundamental role in helping engineers comprehend and maintain a software system. Although the degree to which one program component depends upon another can vary in strength, traditional dependence analysis typically ignores such nuance. To account for this nuance in dependence-based analysis, we propose Causal Program Dependence Analysis (CPDA), a framework based on causal inference that captures the degree (or strength) of the dependence between program elements. For a given program, CPDA intervenes in the program's execution to observe changes in value at selected points in the source code. It observes the association between program elements by constructing and executing modified versions of a program (requiring only lightweight parsing rather than sophisticated static analysis). CPDA applies causal inference to the observed changes to identify the dependence relations between program elements and estimate their strength. We explore the advantages of CPDA's quantified dependence by presenting results for several applications. Our further qualitative evaluation demonstrates 1) that observing different levels of dependence facilitates grouping the various functional aspects found in a program and 2) how focusing on the relative strength of the dependences for a particular program element provides a detailed context for that element. Furthermore, a case study that applies CPDA to debugging illustrates how it can improve engineer productivity.
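To make the intervention idea concrete, the following is a minimal sketch (all names and the toy "program" are hypothetical, not CPDA's implementation): we override the value at one observation point, replay the same stochastic context, and record how often each downstream point changes, a crude proxy for the dependence strength that CPDA estimates with causal inference.

```python
import random

# Toy "program" with three observation points. Intervening at 'a' and
# replaying the same random context isolates a's causal effect.
def program(rng, intervene_a=None):
    a = rng.randint(0, 9)
    if intervene_a is not None:
        a = intervene_a              # intervention overrides the value at 'a'
    b = a % 3                        # 'b' is causally downstream of 'a'
    c = rng.randint(0, 1)            # 'c' is independent of 'a'
    return {"a": a, "b": b, "c": c}

def dependence_on_a(target, trials=2000):
    changed = 0
    for seed in range(trials):
        baseline = program(random.Random(seed))
        altered = program(random.Random(seed),
                          intervene_a=(baseline["a"] + 1) % 10)
        changed += baseline[target] != altered[target]
    return changed / trials          # fraction of runs where target changed

print(dependence_on_a("b"))          # high: strong dependence on 'a'
print(dependence_on_a("c"))          # ~0.0: no dependence on 'a'
```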
The Object Constraint Language (OCL) is a lightweight formal specification language. Integrated within the Unified Modeling Language (UML) standard, it serves as a cornerstone of requirements modeling and enjoys widespread adoption across various domains. OCL can precisely define the pre- and post-conditions of system operations as well as system invariants. While OCL provides a simple yet expressive syntax, it lacks clarity in mapping Object-Oriented (OO) concepts such as object states, object links, and object attributes. This ambiguity makes it challenging for OO developers to identify errors in requirements. In this paper, we propose an approach named OCLVerifier, which can automatically detect requirements errors in OCL contracts, such as conflict, redundancy, and failure errors. OCLVerifier first transforms OO contracts and detection patterns into SMT formulas and then proves them using an SMT solver. Finally, the results are mapped back to the original OCL contracts to display detailed error-type and location information. To evaluate OCLVerifier, we conducted a comprehensive evaluation on four case studies. Experimental results indicate that OCLVerifier successfully identifies 65.5% of error cases, with each identified case offering accurate error-location information. Compared with human experts, OCLVerifier reduces evaluation time by 80.8% while improving repair accuracy by 18%. The results are satisfactory, and the proposed approach can be extended to the software industry for requirements verification.
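The core SMT idea can be illustrated with a small sketch (a hand-written encoding, not OCLVerifier's actual transformation): two OCL-like constraints over an attribute are expressed as z3 formulas, and a conflict is reported when their conjunction is unsatisfiable.

```python
from z3 import Int, Solver, And, unsat

age = Int("age")
inv = age >= 18                     # context Person inv: self.age >= 18
pre = age < 18                      # pre-condition of some operation

s = Solver()
s.add(And(inv, pre))
if s.check() == unsat:
    print("conflict: invariant and pre-condition cannot both hold")
else:
    print("consistent, e.g.", s.model())
```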
Mi Superpoder es la Programación (My Superpower Is Programming) is a web tool designed to teach programming to children and young people. It focuses on developing logical thinking through interactive exercises that cover recognition of computer parts, sequences, patterns, and flowcharts. The tool was developed to address the educational needs identified in the social project of the same name, using modern technologies and a serverless architecture to create an accessible and effective solution for teaching programming. Initial results indicate that students found the tool useful and demonstrated improvements in their understanding of computational logic. This work is framed within the global challenge of teaching programming to children and youth, demonstrating the potential of gamified tools across diverse educational contexts. Future plans include expanding the tool with more modules, allowing customization by teachers, and conducting broader evaluations in different educational environments.
The need for more flexible and robust models for reasoning about systems in the presence of conflicting information is becoming increasingly relevant in different contexts. This has prompted the introduction of paraconsistent transition systems, where transitions are characterized by a pair of weights: one representing the evidence that the transition effectively occurs and the other the evidence of its absence. Such a pair of weights can express scenarios of vagueness and inconsistency. This paper establishes a foundation for a compositional and structured specification approach for paraconsistent transition systems, framed as a paraconsistent institution. The proposed methodology follows the stepwise implementation process outlined by Sannella and Tarlecki.
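As a rough intuition for the weight pairs (our informal reading, not the paper's formal definition), the sketch below models a transition carrying evidence for and against its occurrence, and classifies the three situations the pair can express.

```python
from dataclasses import dataclass

@dataclass
class PTransition:
    src: str
    dst: str
    pos: float   # evidence that the transition occurs, in [0, 1]
    neg: float   # evidence that it does not, in [0, 1]

    def classify(self) -> str:
        total = self.pos + self.neg
        if total < 1:
            return "vague"          # information is missing
        if total > 1:
            return "inconsistent"   # conflicting information
        return "consistent"

print(PTransition("s0", "s1", 0.9, 0.4).classify())  # inconsistent
print(PTransition("s0", "s2", 0.2, 0.3).classify())  # vague
```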
Employing rules for the automatic extraction of conceptual diagrams from software requirements has been practiced for some time. However, relying on rules alone makes the extraction system complex to handle. Moreover, such rules are predominantly based on syntactic structure, such as Part-of-Speech tags and the dependency grammar of sentences, and rarely on semantics. In this paper, we propose a probabilistic approach used in combination with a rule-based technique and word embeddings to preserve the semantics of each sentence, thereby reducing the complexity of the extraction procedure. Further, we advocate a divide-and-conquer policy of extraction: instead of extracting classes from one entire use-case description, we extract class diagrams from small use cases and then merge them to obtain the overall class diagram (see the sketch below). Because a class diagram generated for a small use case can be reused in other, similar software designs, this increases scalability and decreases extraction time. The proposed hybrid approach thus integrates knowledge gained from prior extraction experience. The proposed approach achieved an F1-score of 90%, whereas the F1-scores of existing methods ranged between 79% and 88%. The proposed hybrid approach also shows a 19.44% reduction in the number of iterations required to carry out the extraction procedure for individual use cases, further reducing its complexity.
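The merging step of the divide-and-conquer policy can be sketched as follows (the diagram representation, a plain mapping from class name to attribute set, is our simplification, not the paper's):

```python
# Merge partial class diagrams produced per small use case:
# same-named classes have their attribute sets unioned.
def merge_diagrams(diagrams):
    merged = {}
    for diagram in diagrams:
        for cls, attrs in diagram.items():
            merged.setdefault(cls, set()).update(attrs)
    return merged

d1 = {"Customer": {"name", "email"}, "Order": {"date"}}
d2 = {"Order": {"date", "total"}, "Payment": {"amount"}}
print(merge_diagrams([d1, d2]))
# {'Customer': {'name', 'email'}, 'Order': {'date', 'total'}, 'Payment': {'amount'}}
```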
A Web3.0 Decentralized Application (DApp) is a class of decentralized software in which at least the business logic is implemented using blockchain-based smart contracts. Features such as transparency, a decentralized execution environment, no need for a central authority, immutability of data against manipulation, and a native transaction-based payment system built on cryptographic tokens are the main advantages of Web3.0 DApps over conventional Web2.0 software, in which business logic and user data are centrally controlled by companies with no transparency. However, the development lifecycle of Web3.0 DApps involves many challenges due to the complexity of blockchain technology and smart contracts, as well as the difficulties concerning the integration of a DApp's on-chain and off-chain components. To alleviate these challenges, this paper proposes a Model Driven Architecture (MDA) approach for the development of Web3.0 DApps that streamlines the development of complex multilateral DApps and yields a product that is verifiable, traceable, low-cost, maintainable, less error-prone, and in conformance with blockchain platform concepts. In contrast to previous studies in this area, which applied MDA only to the development of smart contracts, our proposed MDA-based approach covers the full architecture of Web3.0 DApps: on-chain components, off-chain components, and on-chain/off-chain communication patterns. The method's application was demonstrated by implementing a land-leasing DApp, where the requirement model (a BPMN choreography model) was transformed into CIM, PIM, and PSM instances successively, and finally the code base was generated based on the Ethereum platform technology stack. The Epsilon Validation Language (EVL), Epsilon Object Language (EOL), and Epsilon Comparison Language (ECL) were used for the verification/validation of the model instances at each step. Furthermore, by evaluating the quality metrics of the proposed meta-models, we show that they have better ontology coverage and are more reusable and understandable than previous meta-models.
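The final model-to-text step of such an MDA pipeline can be pictured with a hedged sketch: a toy PSM-like description is rendered into an Ethereum contract stub. The model schema and the generated skeleton below are our own illustration; the paper's actual PSM meta-model and Epsilon-based transformations are far richer.

```python
# Hypothetical PSM-like model for a land-leasing contract.
psm = {
    "contract": "LandLease",
    "state": {"lessor": "address", "lessee": "address", "rentWei": "uint256"},
    "operations": ["signLease", "payRent", "terminate"],
}

def to_solidity(model):
    """Render the model as a Solidity contract skeleton (model-to-text)."""
    fields = "\n".join(f"    {t} public {n};"
                       for n, t in model["state"].items())
    funcs = "\n\n".join(f"    function {op}() external {{\n        // TODO\n    }}"
                        for op in model["operations"])
    return (f"// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\n"
            f"contract {model['contract']} {{\n{fields}\n\n{funcs}\n}}\n")

print(to_solidity(psm))
```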
Ensuring timely coordination between autonomous aircraft is a challenging problem in decentralized air traffic management (ATM) applications for urban air mobility (UAM) scenarios. This paper presents an approach for formally guaranteeing timely progress in a Two-Phase Acknowledge distributed knowledge propagation protocol by probabilistically modeling the delays using the theory of the Multicopy Two-Hop Relay protocol and the M/M/1 queueing system. The guarantee states a probabilistic upper bound on the time to progress as a function of the probabilities of the total transmission and processing delays following two specific distributions. The proof uses a general library of formal theories that can be reused for the rigorous mechanical verification of autonomous aircraft coordination protocols using the Athena proof checker and assistant.
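The flavor of the delay bound can be seen from the standard M/M/1 sojourn-time result (illustrative only; the paper composes such models into a protocol-level guarantee): for arrival rate λ < service rate μ, the sojourn time T satisfies P(T ≤ t) = 1 − e^(−(μ−λ)t), which inverts to a probabilistic upper bound on delay.

```python
import math

def mm1_delay_bound(lam, mu, p):
    """Smallest t with P(T <= t) >= p for the M/M/1 sojourn time,
    i.e. t = -ln(1 - p) / (mu - lam)."""
    assert lam < mu, "queue must be stable (lam < mu)"
    return -math.log(1 - p) / (mu - lam)

# e.g. 5 messages/s arriving, 8 messages/s processed, 99% confidence:
print(mm1_delay_bound(lam=5.0, mu=8.0, p=0.99))  # ~1.54 seconds
```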
The existence of infeasible paths in a program reduces the coverage of test cases and wastes valuable testing resources. Detecting infeasible paths allows testing resources to be focused on feasible paths. This paper introduces a method for detecting infeasible paths based on program summaries. Our method partitions the program into sequential statements, conditional statements, and loop statements, and automatically generates statement summaries and function summaries. It then analyzes these summaries to extract path constraints and determine the feasibility of each path. We implemented a detection tool named DTSIP based on this method and conducted experiments on a set of benchmark programs and open-source projects. The results confirm the effectiveness of our method in detecting infeasible paths. It detects both intraprocedural and interprocedural infeasible paths, demonstrating broad applicability. Our method overcomes the challenges associated with analyzing complex paths, achieving efficient feasibility determination while reducing processing time.
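The feasibility check at the heart of such a method can be illustrated with a toy example (our simplification using z3, not DTSIP's summary construction): the conjunction of branch conditions along a path is fed to an SMT solver, and an unsatisfiable result marks the path infeasible.

```python
from z3 import Int, Solver, unsat

x = Int("x")
# Constraints collected along one path, e.g. from 'if (x > 10)' then 'if (x < 5)'.
path_constraints = [x > 10, x < 5]

s = Solver()
s.add(*path_constraints)
print("infeasible" if s.check() == unsat else "feasible")
```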