{"title":"Applications of multimodal physical (IoT), cyber and social data for reliable and actionable insights","authors":"A. Sheth, Pramod Anantharam, K. Thirunarayan","doi":"10.4108/ICST.COLLABORATECOM.2014.257553","DOIUrl":null,"url":null,"abstract":"Physical objects with embedded sensors are increasingly being networked together using wireless and internet technologies to form Internet of Things (IoT). However, early applications that rely on IoT data fail to provide comprehensive situational awareness. This often requires combining physical (i.e., IoT) data with social data created by humans on the Web and increasingly on their mobile phones (i.e., citizen sensing) as well as other data such as structured open data and background knowledge available on the Web (i.e., cyber data and knowledge). In this paper, we explore how integration and analysis of multimodal physical-cybersocial data can support advanced applications and enrich human experience. Specifically, we illustrate the complementary role played by sensor and social data, often intermediated by other Web based data and knowledge, using real-world examples in the domain of situational awareness, traffic monitoring, and healthcare. 
We also show how semantic techniques and technologies support critical data interoperability needs, advanced computation capabilities including reasoning, and significantly enhance our ability to exploit growing amount of data from the proliferation of Internet of Things.","PeriodicalId":432345,"journal":{"name":"10th IEEE International Conference on Collaborative Computing: Networking, Applications and Worksharing","volume":"34 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"10th IEEE International Conference on Collaborative Computing: Networking, Applications and Worksharing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4108/ICST.COLLABORATECOM.2014.257553","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Physical objects with embedded sensors are increasingly being networked together using wireless and Internet technologies to form the Internet of Things (IoT). However, early applications that rely on IoT data alone fail to provide comprehensive situational awareness. Achieving it often requires combining physical (i.e., IoT) data with social data created by humans on the Web and, increasingly, on their mobile phones (i.e., citizen sensing), as well as other data such as structured open data and background knowledge available on the Web (i.e., cyber data and knowledge). In this paper, we explore how integration and analysis of multimodal physical-cyber-social data can support advanced applications and enrich human experience. Specifically, we illustrate the complementary roles played by sensor and social data, often intermediated by other Web-based data and knowledge, using real-world examples in the domains of situational awareness, traffic monitoring, and healthcare. We also show how semantic techniques and technologies support critical data interoperability needs and advanced computation capabilities, including reasoning, and significantly enhance our ability to exploit the growing amount of data arising from the proliferation of the Internet of Things.
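The abstract's core idea, that physical (IoT) and social (citizen-sensing) observations corroborate each other, can be sketched minimally. The following is an illustrative example, not code from the paper: both modalities are mapped to a shared vocabulary term and location (a stand-in for the semantic interoperability the authors describe), and corroboration across modalities raises confidence via a simple noisy-OR combination. All names and values here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    source: str       # "physical" (IoT sensor) or "social" (citizen report)
    location: str     # shared spatial reference
    phenomenon: str   # term from a shared vocabulary, e.g. "traffic-congestion"
    confidence: float # per-source belief in [0, 1]

def fuse(observations):
    """Combine observations of the same phenomenon at the same location.

    Independent corroborating sources raise the fused confidence
    using a noisy-OR: p = 1 - prod(1 - p_i)."""
    fused = {}
    for obs in observations:
        key = (obs.location, obs.phenomenon)
        prior = fused.get(key, 0.0)
        fused[key] = 1.0 - (1.0 - prior) * (1.0 - obs.confidence)
    return fused

# Hypothetical readings: a loop sensor and a citizen tweet, same place and term.
readings = [
    Observation("physical", "I-70 mile 12", "traffic-congestion", 0.6),
    Observation("social",   "I-70 mile 12", "traffic-congestion", 0.5),
]
result = fuse(readings)
# noisy-OR of 0.6 and 0.5: 1 - 0.4 * 0.5 = 0.8
```

The shared `(location, phenomenon)` key is doing the work that semantic annotation does in the paper's setting: without a common vocabulary, the two modalities could not be joined at all.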