I Can Parse You: Grammars for Dialogs (DOI: 10.4230/LIPIcs.SNAPL.2017.6)
Martin Hirzel, Louis Mandel, Avraham Shinnar, Jérôme Siméon, M. Vaziri
Humans and computers increasingly converse via natural language. Those conversations are moving from today's simple question answering and command-and-control to more complex dialogs. Developers must specify those dialogs. This paper explores how to assist developers in this specification. We map out the staggering variety of applications for human-computer dialogs and distill it into a catalog of flow patterns. Based on that, we articulate the requirements for dialog programming models and offer our vision for satisfying these requirements using grammars. If our approach catches on, computers will soon parse you to better assist you in your daily life.
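To make the grammar-based view of dialogs concrete, here is a minimal Python sketch under the assumption that a dialog can be specified as a sequence of slots, each of which must "parse" from user turns; the flight-booking slots, prompts, and run_dialog driver are invented for illustration and are not the authors' system.

```python
# Minimal sketch: a dialog specified as a grammar of slots that must each
# "parse" from user turns. The slots, prompts, and driver are invented for
# illustration; this is not the system described in the paper.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Slot:
    """One nonterminal of the dialog grammar."""
    name: str
    prompt: str
    parse: Callable[[str], object]  # raises ValueError on a non-matching reply

def run_dialog(slots: List[Slot], ask: Callable[[str], str]) -> Dict[str, object]:
    """Drive the dialog like a recursive-descent parser over user turns."""
    filled: Dict[str, object] = {}
    for slot in slots:
        while True:
            reply = ask(slot.prompt)
            try:
                filled[slot.name] = slot.parse(reply)
                break  # this slot parsed; move on to the next nonterminal
            except ValueError:
                continue  # re-prompt: the reply did not parse for this slot
    return filled

if __name__ == "__main__":
    booking = [
        Slot("destination", "Where would you like to fly?", str.strip),
        Slot("travelers", "How many travelers?", int),
    ]
    turns = iter(["Paris", "two", "2"])  # scripted user replies for the demo
    print(run_dialog(booking, lambda prompt: next(turns)))
```

Choices and repetition in a dialog would correspond to alternation and iteration in the grammar, which is where the parsing analogy does real work.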
Programming Language Tools and Techniques for 3D Printing (DOI: 10.4230/LIPIcs.SNAPL.2017.10)
Chandrakana Nandi, A. Caspi, D. Grossman, Zachary Tatlock
We propose a research agenda to investigate programming language techniques for improving affordable, end-user desktop manufacturing processes such as 3D printing. Our goal is to adapt programming language tools and extend the decades of research in industrial, high-end CAD/CAM in order to help make affordable desktop manufacturing processes more accurate, fast, reliable, and accessible to end-users. We focus on three major areas where 3D printing can benefit from programming language tools: design synthesis, optimizing compilation, and runtime monitoring. We present preliminary results on synthesizing editable CAD models from difficult-to-edit surface meshes, discuss potential new compilation strategies, and propose runtime monitoring techniques. We conclude by discussing additional near-future directions we intend to pursue.
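As a toy illustration of the gap between editable CAD models and surface meshes (a sketch of the general contrast only, not the paper's synthesis technique; the Cylinder class and its fields are invented), the snippet below represents a design twice: as a parametric model whose radius is a one-line edit, and as a tessellated mesh where the same change would require recomputing every vertex and where the original parameters are no longer visible.

```python
# Toy contrast between an editable, parametric CAD model and a frozen surface
# mesh; a sketch of the general idea, not the paper's synthesis technique.
import math
from dataclasses import dataclass
from typing import List, Tuple

Vertex = Tuple[float, float, float]

@dataclass
class Cylinder:
    """Parametric CAD model: editing means changing one field."""
    radius: float
    height: float

    def tessellate(self, segments: int = 16) -> List[Vertex]:
        """Lower the design to mesh vertices (the direction that loses structure)."""
        ring: List[Vertex] = []
        for i in range(segments):
            a = 2 * math.pi * i / segments
            x, y = self.radius * math.cos(a), self.radius * math.sin(a)
            ring += [(x, y, 0.0), (x, y, self.height)]
        return ring

design = Cylinder(radius=5.0, height=20.0)
mesh = design.tessellate()

# Editing the CAD model is a one-line change; the old mesh is now simply stale,
# and recovering "radius" from raw vertices is roughly the synthesis problem
# the abstract mentions.
design.radius = 7.5
print(len(mesh), "stale vertices versus new radius", design.radius)
```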
Blame Tracking and Type Error Debugging (DOI: 10.4230/LIPIcs.SNAPL.2019.2)
Sheng Chen, J. P. Campora
In this work, we present an unexpected connection between gradual typing and type error debugging. Namely, we illustrate that gradual typing provides a natural way to defer type errors in statically ill-typed programs, providing more feedback than traditional approaches to deferring type errors. When evaluating expressions that lead to runtime type errors, the usefulness of the feedback depends on blame tracking, the de facto approach to locating the cause of such runtime type errors. Unfortunately, blame tracking suffers from the bias problem for type error localization in languages with type inference. We illustrate and formalize the bias problem for blame tracking, present ideas for adapting existing type error debugging techniques to combat this bias, and outline further challenges.
2012 ACM Subject Classification: Theory of computation → Type structures
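A minimal sketch of the mechanism behind these claims, assuming a Python encoding of run-time casts with blame labels in the spirit of gradual-typing semantics (the cast function, labels, and example program are hypothetical, not the paper's formalism): the ill-typed call still runs until the value is used, and the resulting error blames whatever label the cast carries, which is exactly where a bias toward one side can mislead the programmer.

```python
# Minimal sketch of casts with blame labels, in the spirit of gradual-typing
# semantics; the labels and example program are hypothetical, not from the paper.

class BlameError(Exception):
    def __init__(self, label: str, expected: type, value: object):
        super().__init__(f"blame {label}: expected {expected.__name__}, got {value!r}")

def cast(value, expected: type, label: str):
    """Run-time cast: succeed silently or blame `label` for the failure."""
    if isinstance(value, expected):
        return value
    raise BlameError(label, expected, value)

def double(x):
    # The static type error is deferred: we only check when the value is used.
    return cast(x, int, label="call site of double") * 2

print(double(21))           # fine: 42
try:
    print(double("21"))     # deferred error surfaces here, blaming the call site
except BlameError as err:
    # The blamed label is fixed by whoever inserted the cast; with type
    # inference, that choice can be biased toward the wrong program location.
    print(err)
```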
Leveraging Sequential Computation for Programming Efficient and Reliable Distributed Systems (DOI: 10.4230/LIPIcs.SNAPL.2017.7)
I. Kuraj, Armando Solar-Lezama
While sequential programs represent a simple and natural form for expressing functionality, the corresponding distributed implementations are considerably more complex. We examine the possibility of using the sequential computation model for programming distributed systems, and the requirements for making that possible. The benefits of such an approach include easier specification of and reasoning about behaviors in the system, as well as the possibility of directly reusing existing techniques for correctness checking and optimization of sequential programs to produce efficient and reliable distributed implementations.
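The contrast the abstract draws can be made concrete with a toy counter, written once as a sequential specification and once as a replicated implementation with an explicit merge; the ReplicaCounter sketch below is a standard grow-only counter used purely for illustration and is not the authors' proposed mechanism.

```python
# Toy contrast motivating the premise (a sketch, not the authors' approach):
# the sequential version is a one-liner, while even a trivial replicated
# version must manage and merge divergent per-replica state.

def sequential_count(events):
    """Sequential specification: just count the events."""
    return len(list(events))

class ReplicaCounter:
    """Replicated implementation: per-replica counts plus an explicit merge."""
    def __init__(self, replica_id: str):
        self.replica_id = replica_id
        self.counts = {replica_id: 0}

    def increment(self):
        self.counts[self.replica_id] += 1

    def merge(self, other: "ReplicaCounter"):
        # Take the per-replica maximum so concurrent updates are not lost.
        for rid, n in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), n)

    def value(self) -> int:
        return sum(self.counts.values())

a, b = ReplicaCounter("a"), ReplicaCounter("b")
a.increment(); a.increment(); b.increment()
a.merge(b); b.merge(a)
assert a.value() == b.value() == 3 == sequential_count(["e1", "e2", "e3"])
```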
The Next 700 Semantics: A Research Challenge (DOI: 10.4230/LIPIcs.SNAPL.2019.9)
S. Krishnamurthi, B. Lerner, L. Elberty
Modern systems consist of large numbers of languages, frameworks, libraries, APIs
Intermittent Computing: Challenges and Opportunities (DOI: 10.4230/LIPIcs.SNAPL.2017.8)
Brandon Lucia, Vignesh Balaji, A. Colin, Kiwan Maeng, E. Ruppel
The maturation of energy-harvesting technology and ultra-low-power computer systems has led to the advent of intermittently-powered, batteryless devices that operate entirely using energy extracted from their environment. Intermittently operating devices present a rich vein of programming languages research challenges, and the purpose of this paper is to illustrate these challenges to the PL research community. To provide depth, this paper includes a survey of the hardware and software design space of intermittent computing platforms. On the foundation of these research challenges and the state of the art in intermittent hardware and software, this paper describes several future PL research directions, emphasizing a connection between intermittence, distributed computing, energy-aware programming and compilation, and approximate computing. We illustrate these connections with a discussion of our ongoing work on programming for intermittence, and on building and simulating intermittent distributed systems.
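To see why intermittence changes the programming model, the following hypothetical simulation (not any of the surveyed systems) runs a loop under random power failures: volatile progress is lost on every failure, so the loop checkpoints its state into a dictionary standing in for non-volatile memory such as FRAM and resumes from the last checkpoint after each "reboot".

```python
# Hypothetical simulation of intermittent execution: volatile state is lost on
# every power failure, so long-running work must checkpoint to non-volatile
# storage to make progress. Not a model of any specific system from the paper.
import random

nonvolatile = {"i": 0, "acc": 0}   # survives power failures (stand-in for FRAM)

def power_failure() -> bool:
    """Energy harvesting is bursty: pretend we lose power 30% of the time."""
    return random.random() < 0.3

def sum_squares(n: int) -> int:
    while True:
        # Restore volatile state from the last checkpoint after each reboot.
        i, acc = nonvolatile["i"], nonvolatile["acc"]
        while i < n:
            if power_failure():
                break                         # reboot: jump back and restore
            acc += i * i
            i += 1
            nonvolatile.update(i=i, acc=acc)  # checkpoint after each step
        else:
            return acc

random.seed(0)
print(sum_squares(100))   # completes despite repeated simulated power failures
```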
None, One, Many - What's the Difference, Anyhow? (DOI: 10.4230/LIPIcs.SNAPL.2015.294)
F. Steimann
We observe that, compared to natural and modelling languages, the differences in expression required to deal with no, one, or many objects are particularly pronounced in programming languages. We identify some problems inherent in type-based unifications of the different numbers, and advocate a solution that builds on the introduction of multiplicity as a new grammatical category of programming languages.
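The following hypothetical Python fragment shows the asymmetry in everyday code: the same conceptual operation needs three signatures for exactly one, optional, and many objects, whereas a multiplicity-agnostic helper treats all three uniformly. None of this is the paper's proposed language mechanism, only the shape of the problem it targets.

```python
# Hypothetical illustration of the none/one/many asymmetry; not the paper's
# proposed mechanism, just the everyday Python shape of the problem.
from typing import Iterable, List, Optional

# Three signatures for what is conceptually the same operation:
def email_of(person: dict) -> str:                           # exactly one
    return person["email"]

def email_of_opt(person: Optional[dict]) -> Optional[str]:   # none or one
    return person["email"] if person is not None else None

def emails_of(people: Iterable[dict]) -> List[str]:          # many
    return [p["email"] for p in people]

# A multiplicity-agnostic version: normalize everything to "zero or more".
def as_many(x) -> list:
    if x is None:
        return []
    return list(x) if isinstance(x, (list, tuple, set)) else [x]

def emails(x) -> List[str]:
    return [p["email"] for p in as_many(x)]

alice = {"email": "alice@example.org"}
print(emails(None), emails(alice), emails([alice, alice]))
```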
Programming with "Big Code": Lessons, Techniques and Applications (DOI: 10.4230/LIPIcs.SNAPL.2015.41)
Pavol Bielik, Veselin Raychev, Martin T. Vechev
Programming tools based on probabilistic models of massive codebases (aka "Big Code") promise to solve important programming tasks that were difficult or practically infeasible to address before. However, building such tools requires solving a number of hard problems at the intersection of programming languages, program analysis and machine learning. In this paper we summarize some of our experiences and insights obtained by developing several such probabilistic systems over the last few years (some of these systems are regularly used by thousands of developers worldwide). We hope these observations can provide a guideline for others attempting to create such systems. We also present a prediction approach we find suitable as a starting point for building probabilistic tools, and discuss a practical framework implementing this approach, called Nice2Predict. We release the Nice2Predict framework publicly - the framework can be immediately used as a basis for developing new probabilistic tools. Finally, we present programming applications that we believe will benefit from probabilistic models and should be investigated further.
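As a deliberately tiny starting point in the spirit of "learn from code, then predict" (far simpler than Nice2Predict, whose actual model is structured prediction over program elements; the training snippets below are invented), this sketch counts co-occurrences between context tokens and identifier names and predicts the most probable name for a new context.

```python
# Toy probabilistic name prediction from surrounding tokens. Deliberately much
# simpler than Nice2Predict; the training data is invented for illustration.
from collections import Counter, defaultdict
from typing import Dict, Iterable, Tuple

def train(samples: Iterable[Tuple[Iterable[str], str]]) -> Dict[str, Counter]:
    """Count how often each context token co-occurs with each identifier name."""
    model: Dict[str, Counter] = defaultdict(Counter)
    for context, name in samples:
        for token in context:
            model[token][name] += 1
    return model

def predict(model: Dict[str, Counter], context: Iterable[str]) -> str:
    """Score candidate names by summed co-occurrence counts and take the best."""
    scores: Counter = Counter()
    for token in context:
        scores.update(model.get(token, Counter()))
    return scores.most_common(1)[0][0] if scores else "<unknown>"

corpus = [
    (["for", "in", "range", "len"], "i"),
    (["open", "read", "close"], "file"),
    (["open", "write", "close"], "file"),
    (["append", "for", "in"], "items"),
]
model = train(corpus)
print(predict(model, ["open", "read"]))   # -> 'file'
print(predict(model, ["for", "range"]))   # -> 'i'
```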
The Dynamic Practice and Static Theory of Gradual Typing (DOI: 10.4230/LIPIcs.SNAPL.2019.6)
M. Greenberg
We can tease apart the research on gradual types into two 'lineages': a pragmatic, implementation-oriented dynamic-first lineage and a formal, type-theoretic, static-first lineage. The dynamic-first lineage's focus is on taming particular idioms - 'pre-existing conditions' in untyped programming languages. The static-first lineage's focus is on interoperation and individual type system features, rather than the collection of features found in any particular language. Both appear in programming languages research under the name "gradual typing", and they are in active conversation with each other. What are these two lineages? What challenges and opportunities await the static-first lineage? What progress has been made so far?
Toward Domain-Specific Solvers for Distributed Consistency (DOI: 10.4230/LIPIcs.SNAPL.2019.10)
L. Kuper, P. Alvaro
To guard against machine failures, modern internet services store multiple replicas of the same application data within and across data centers, which introduces the problem of keeping geo-distributed replicas consistent with one another in the face of network partitions and unpredictable message latency. To avoid costly and conservative synchronization protocols, many real-world systems provide only weak consistency guarantees (e.g., eventual, causal, or PRAM consistency), which permit certain kinds of disagreement among replicas. There has been much recent interest in language support for specifying and verifying such consistency properties. Although these properties are usually beyond the scope of what traditional type checkers or compiler analyses can guarantee, solver-aided languages are up to the task. Inspired by systems like Liquid Haskell [43] and Rosette [42], we believe that close integration between a language and a solver is the right path to consistent-by-construction distributed applications. Unfortunately, verifying distributed consistency properties requires reasoning about transitive relations (e.g., causality or happens-before), partial orders (e.g., the lattice of replica states under a convergent merge operation), and properties relevant to message processing or API invocation (e.g., commutativity and idempotence); such reasoning cannot be easily or efficiently carried out by general-purpose SMT solvers that lack native support for it. We argue that domain-specific SMT-based tools that exploit the mathematical foundations of distributed consistency would enable both more efficient verification and improved ease of use for domain experts. The principle of exploiting domain knowledge for efficiency and expressivity that has borne fruit elsewhere – such as in the development of high-performance domain-specific languages that trade off generality to gain both performance and productivity – also applies here. Languages augmented with domain-specific, consistency-aware solvers would support the rapid implementation of formally verified programming abstractions that guarantee distributed consistency. In the long run, we aim to democratize the development of such domain-specific solvers by creating a framework for domain-specific solver development that brings new theory solver implementation within the reach of programmers who are not necessarily experts in SMT solver internals.
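A small hypothetical sketch of the objects such a domain-specific solver would reason about: a grow-only set whose merge is the join of the subset lattice, together with naive run-time spot checks of the commutativity, associativity, and idempotence obligations that a consistency-aware solver would instead discharge symbolically, once and for all.

```python
# Hypothetical sketch of a convergent replicated object and the algebraic
# properties a domain-specific solver would verify; not tied to any real tool.
from itertools import product

def merge(a: frozenset, b: frozenset) -> frozenset:
    """Grow-only set: merge is set union, the join of the subset lattice."""
    return a | b

replicas = [frozenset({"x"}), frozenset({"x", "y"}), frozenset({"z"})]

# A solver-aided language would discharge these obligations symbolically;
# here we only spot-check them on a few concrete replica states.
for a, b in product(replicas, repeat=2):
    assert merge(a, b) == merge(b, a)                      # commutative
    assert merge(a, a) == a                                # idempotent
for a, b, c in product(replicas, repeat=3):
    assert merge(merge(a, b), c) == merge(a, merge(b, c))  # associative

# Convergence: any delivery order of the same updates yields the same state.
state1 = merge(merge(replicas[0], replicas[1]), replicas[2])
state2 = merge(replicas[2], merge(replicas[1], replicas[0]))
assert state1 == state2
print(sorted(state1))   # ['x', 'y', 'z']
```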