{"title":"New Challenges and Opportunities in Stream Processing: Transactions, Predictive Analytics, and Beyond: (Invited Keynote)","authors":"Nesime Tatbul","doi":"10.1145/3210284.3214706","DOIUrl":null,"url":null,"abstract":"EXTENDED ABSTRACT Stream processing has been an area of ongoing research since the early 2000s. Fueled by industry’s growing interest in dealing with high-velocity big data in near real-time settings, there has been a resurgence of recent activity in both research and engineering of large-scale stream processing systems. In this talk, we will examine the state of the art, focusing in particular on key trends of the past five years with an outlook towards the next five years. I will also give examples from our own work, including stream processing in transactional settings as well as predictive time series analytics for the Internet of Things. Transactional stream processing broadly refers to processing streaming data with correctness guarantees. These guarantees include not only properties that are intrinsic to stream processing (e.g., order, exactly-once semantics), but also ACID properties of traditional OLTP-oriented databases, which arise in streaming applications with shared mutable state. In our recent work, we have designed and built the S-Store System, a scalable main-memory system that supports hybrid OLTP+streaming workloads with strict correctness needs [5]. A use case that best exemplifies the strengths of S-Store is real-time data ingestion [4]. Thus, I will also discuss the requirements of modern data ingestion and how to meet them using S-Store, especially within the context of our BigDAWG Polystore System [1, 6].","PeriodicalId":412438,"journal":{"name":"Proceedings of the 12th ACM International Conference on Distributed and Event-based Systems","volume":"473 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 12th ACM International Conference on Distributed and Event-based Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3210284.3214706","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Stream processing has been an area of ongoing research since the early 2000s. Fueled by industry's growing interest in handling high-velocity big data in near-real-time settings, there has been a recent resurgence of activity in both the research and the engineering of large-scale stream processing systems. In this talk, I will examine the state of the art, focusing in particular on key trends of the past five years with an outlook towards the next five. I will also give examples from our own work, including stream processing in transactional settings as well as predictive time series analytics for the Internet of Things.

Transactional stream processing broadly refers to processing streaming data with correctness guarantees. These guarantees include not only properties intrinsic to stream processing (e.g., ordering and exactly-once semantics), but also the ACID properties of traditional OLTP-oriented databases, which arise in streaming applications with shared mutable state. In our recent work, we have designed and built S-Store, a scalable main-memory system that supports hybrid OLTP+streaming workloads with strict correctness requirements [5]. A use case that best exemplifies the strengths of S-Store is real-time data ingestion [4]. I will therefore also discuss the requirements of modern data ingestion and how to meet them using S-Store, especially within the context of our BigDAWG Polystore System [1, 6].
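To make the shared-state issue concrete, the following is a minimal, illustrative sketch (not S-Store's actual API; all table, column, and function names here are hypothetical) of the pattern the abstract describes: a stream of events updates shared mutable state under ACID guarantees, with exactly-once semantics obtained by committing a record of each processed batch in the same transaction as the state update. An embedded SQLite database stands in for the shared state.

```python
# Toy "transactional stream processing" loop: each ordered input batch is
# applied to shared mutable state inside one ACID transaction, and a
# processed_batches table suppresses duplicate deliveries (exactly-once).
import sqlite3


def setup(conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE IF NOT EXISTS counts (key TEXT PRIMARY KEY, value INTEGER)")
    conn.execute("CREATE TABLE IF NOT EXISTS processed_batches (batch_id INTEGER PRIMARY KEY)")
    conn.commit()


def process_batch(conn: sqlite3.Connection, batch_id: int, events: list[str]) -> None:
    """Apply one batch of events atomically and at most once."""
    cur = conn.cursor()
    # Exactly-once: skip any batch whose id has already been committed.
    if cur.execute("SELECT 1 FROM processed_batches WHERE batch_id = ?", (batch_id,)).fetchone():
        return
    try:
        # Shared mutable state (counts) and stream-progress bookkeeping
        # (processed_batches) commit together, or not at all.
        for key in events:
            cur.execute(
                "INSERT INTO counts (key, value) VALUES (?, 1) "
                "ON CONFLICT(key) DO UPDATE SET value = value + 1",
                (key,),
            )
        cur.execute("INSERT INTO processed_batches (batch_id) VALUES (?)", (batch_id,))
        conn.commit()
    except Exception:
        conn.rollback()
        raise


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    setup(conn)
    process_batch(conn, 1, ["sensor-a", "sensor-b", "sensor-a"])
    process_batch(conn, 1, ["sensor-a"])  # duplicate delivery of batch 1 is ignored
    print(dict(conn.execute("SELECT key, value FROM counts")))  # {'sensor-a': 2, 'sensor-b': 1}
```

The point of the sketch is only the pattern: because state updates and progress bookkeeping commit atomically, a restart can replay input without double-applying it. A purpose-built hybrid OLTP+streaming engine enforces the same kind of guarantee inside the engine itself rather than at the application level.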