Poster: Decentralized Disaster Recovery Networks Using Beacon Stuffing
T. D. Nguyen, Q. Minh, Vu Pham Tran
DOI: https://doi.org/10.1145/2938559.2948774

This paper proposes decentralized disaster recovery networks (DRNs) that use beacon stuffing to simplify network establishment. The approach leverages the wireless connections of mobile devices in disaster areas without additional hardware such as wireless network interface cards (WNICs). Simulation results show the feasibility of the proposed method for disaster recovery.
Poster: Air Quality Friendly Route Recommendation System
Savina Singla, D. Bansal, Archan Misra
DOI: https://doi.org/10.1145/2938559.2948780

The goal is to model an individual's overall personal inhalation of hazardous airborne gases (both indoor and outdoor) and to provide air-quality-friendly route recommendations, thereby improving the quality of urban movement and supporting a healthier life.
Poster: ETRACK: Energy Efficient Tracking a Mobile Object Using Under Water Sensors
Nazia Majadi, Mahmuda Naznin, Toufique Ahmed
DOI: https://doi.org/10.1145/2938559.2948845

Underwater wireless sensor networks (UWSNs) are challenging to maintain because underwater sensors are difficult to replace. A mobile target tracking protocol for UWSNs must therefore include an energy-saving strategy. We propose a local-search-based, energy-saving tracking method for UWSNs that uses a meta-heuristic algorithm to keep the minimum number of sensors active, which ultimately increases the network lifetime. We validate our method with experimental results.
Poster: Context Aware Route Determination Model for Mobile Indoor Navigation Systems for Vision Impaired People
N. Fernando, D. McMeekin, I. Murray
DOI: https://doi.org/10.1145/2938559.2948799

The wayfinding abilities of vision impaired (VI) people cannot be expected to match those of people who use vision as their primary sense, so the route determination methods in travel aids for VI people need to be adapted accordingly. This study proposes an indoor route determination model that treats the built environment and user context, rather than distance, as the priority factors. It will identify how the elements of a built environment affect the determination of a suitable path that maximizes safety and convenience in an unfamiliar environment, and how that path varies with the individual characteristics of VI people.
Demo: Interactive Visual Privacy Control with Gestures
Jiayu Shu, Rui Zheng, P. Hui
DOI: https://doi.org/10.1145/2938559.2938571

The built-in cameras of mobile and wearable devices enable a variety of applications, such as augmented reality, continuous sensing, and life-logging systems, which bring joy and convenience to human lives. However, being recorded by unauthorized or unnoticed cameras has raised concerns about visual privacy. To address this problem, we propose a novel interactive method for controlling visual privacy. Individuals can interact with cameras using static tags or more flexible hand gestures to express their privacy preferences. By delivering privacy control messages via these visual indicators, devices automatically perform control operations according to the detected indicators and the associated control rules.
Poster: QoE-centric Mobile Operating System Design
Scott Haseley, Geoffrey Challen
DOI: https://doi.org/10.1145/2938559.2938604

Current operating systems are already proficient at managing certain system resources, such as the CPU, memory, and disk. But on interactive mobile devices, users care more about resources such as time, battery life, and money, which are unmanaged or poorly managed by today's smartphone platforms. The degree to which mobile devices effectively manage these human-facing resources determines a user's quality of experience (QoE), and it is QoE that should drive not just policy but decisions on mobile devices. To meet smartphone users' expectations, it is necessary to design systems that can accurately measure and understand QoE, and make decisions based on it.
Demo: Fusing WiFi and Video Sensing for Accurate Group Detection in Indoor Spaces
Kasthuri Jayarajah, Zaman Lantra, Ritesh Kumar, Archan Misra
DOI: https://doi.org/10.1145/2938559.2938582

Understanding one's group context in indoor spaces is useful for many reasons -- e.g., at a shopping mall, knowing a customer's group context can help in offering context-specific incentives or estimating taxi demand for customers exiting the mall. We previously presented GruMon (Sen et al.), which detects groups accurately under the assumption of accurate localization or the availability of inertial sensors from smartphones carried by the users. However, in most real situations, (1) client-side sensory information is not available, and (2) server-side localization is erroneous. Further, detecting people who do not carry mobile phones (e.g., children or elders, and phones with WiFi turned off), whom we refer to as "hidden nodes", and separating smaller sub-groups from a larger group that happens to share similar trajectories, remain challenges. In this demo paper, we present our improved system for group detection in indoor environments. We overcome the key challenges of localization errors, hidden nodes, and sub-groups by fusing the WiFi and video sensing modalities.
Poster: A Step Towards Smart Traffic Sign Board by Smart Devices
S. Gautam, Hari Prabhat Gupta, Tanima Dutta
DOI: https://doi.org/10.1145/2938559.2948856

A Traffic Sign Board (TSB) provides road users with advance information about road conditions ahead, as well as orders, warnings, or guidance. Traditional TSBs rely on the sign board itself as the signal and expect drivers to notice it. In some scenarios, factors such as rain, traffic jams, poor light, and high vehicle speed reduce the visibility of TSBs. In this work, we propose a smart traffic sign board system using Bluetooth Low Energy (BLE). The system opportunistically advertises the TSB information to road users (e.g., drivers, passengers, and riders) over BLE.
Demo: What You Mark is What Apps See
Nisarg Raval, Animesh Srivastava, Ali Razeen, Kiron Lebeck, Ashwin Machanavajjhala, Landon P. Cox
DOI: https://doi.org/10.1145/2938559.2938579

The proliferation of camera-equipped computers creates a dilemma. On one hand, cameras enable many useful applications, including video chat, document scanning, and QR-code reading. At the same time, sensitive information in the physical environment can inadvertently leak through image data shared with applications. Preventing leaks by a determined attacker is likely impossible, but as camera-based applications become more central to our work and personal lives, it is imperative to develop tools that provide greater control over what information applications can access through cameras. We have designed and implemented two systems, PrivateEye and WaveOff, that use privacy markers to provide fine-grained access control of information within a camera's view. Our demonstration highlights the accuracy, performance, and usability of these systems.
Demo: Mobile Plus: Mobile Platform for Transparent Sharing of Functionalities Across Devices
Sangeun Oh, Hyuck Yoo, Dae R. Jeong, Sooyoung Park, D. H. Bui, S. Moon, I. Shin
DOI: https://doi.org/10.1145/2938559.2938568