Proceedings of the 2nd international Workshop on Sensor-based Activity Recognition and Interaction (June 25, 2015)

A study on measuring heart- and respiration-rate via wrist-worn accelerometer-based seismocardiography (SCG) in comparison to commonly applied technologies
Marian Haescher, Denys J. C. Matthies, John Trimpop, B. Urban
DOI: 10.1145/2790044.2790054
Since the human body is a living organism, it emits various life signs, which can be traced not only with action-potential-sensitive electromyography but also with motion-sensitive sensors such as typical inertial sensors. In this paper, we present a possibility to recognize the heart rate (HR), respiration rate (RR), and muscular microvibrations (MV) with an accelerometer worn on the wrist. We compare our seismocardiography (SCG) / ballistocardiography (BCG) approach to commonly used measuring methods. In conclusion, our study confirmed that SCG/BCG with a wrist-worn accelerometer also provides accurate vital parameters. While the recognized RR deviated slightly from the ground truth (SD=16.61%), the detected HR did not differ significantly from the gold standard (SD=1.63%).

A review and quantitative comparison of methods for kinect calibration
Wei Xiang, C. Conly, Christopher D. McMurrough, V. Athitsos
DOI: 10.1145/2790044.2790056
To utilize the full potential of RGB-D devices, calibration must be performed to determine the intrinsic and extrinsic parameters of the color and depth sensors and to reduce lens and depth distortion. After doing so, depth pixels can be mapped to color pixels and both data streams can be utilized simultaneously. This work presents an overview and quantitative comparison of RGB-D calibration techniques and examines how image resolution and the number of images affect calibration.

Computational causal behaviour models for assisted manufacturing
Sebastian Bader, Frank Krüger, T. Kirste
DOI: 10.1145/2790044.2790058
In this paper, we present a computational state space model to track and analyse the activities of workers during manual assembly processes. Such models are well suited to capture the semi-structured processes present in final product assembly tasks. In contrast to pure activity recognition systems, which map sensor data to executed activities, these models are able to track the context of the user and to reason about context variables that are not directly observable through sensors. We describe our modelling approach and report on first evaluation results.

Acoustic tracking of hand activities on surfaces
Andreas Braun, Stefan Krepp, Arjan Kuijper
DOI: 10.1145/2790044.2790052
Many common activities are tactile in nature. We touch, grasp, and interact with a plethora of objects every day, and some of those objects already register our activities, such as the millions of touch screens we use every day. Adding perception to arbitrary objects is an active area of research, with a variety of technologies in use. Acoustic sensors, such as microphones, react to mechanical waves propagating through a medium; by attaching an acoustic sensor to a surface, we can analyze activities on that medium. In this paper, we present signal analysis and machine learning methods that enable us to detect a variety of interaction events on a surface. We extend previous work by combining swipe and touch detection in a single method, for the latter achieving an accuracy between 91% and 99% with a single microphone and 97% to 100% with two microphones.

Opportunities for activity recognition using ultrasound doppler sensing on unmodified mobile phones
Biying Fu, Jakob Karolus, T. Große-Puppendahl, Jonathan Hermann, Arjan Kuijper
DOI: 10.1145/2790044.2790046
Activity recognition on smartphones is now ubiquitously applied, for example to monitor personal health. The smartphone's sensors provide a foundation of information on movements and on the user's location or direction. Incorporating ultrasound sensing using the smartphone's native speaker and microphone provides additional means of perceiving the environment and the humans within it. In this paper, we outline possible usage scenarios for this new and promising sensing modality. Based on a custom implementation, we provide results on various experiments to assess the opportunities for activity recognition systems, and we discuss limitations and possibilities when the smartphone is worn on the body. In stationary deployments, e.g. placed on a nightstand, our implementation is able to detect movements at distances of up to 2 m as well as discern several gestures performed above the phone.

A typology of wearable activity recognition and interaction
Manuel Dietrich, Kristof Van Laerhoven
DOI: 10.1145/2790044.2790048
In this paper, we provide a typology of sensor-based activity recognition and interaction, which we call wearable activity recognition. The typology focuses on a conceptual level on the relation between persons and computing systems. Two paradigms are seen as predominant: first, activity-based seamless and unobtrusive interaction, and second, activity tracking for reflection. This conceptual approach leads to the key term of this technology research, which is currently underexposed in a wider, conceptual understanding: human action/activity. Modeling human action has been a topic in human-computer interaction (HCI) since the field's beginning. We apply two classic theories that have been influential in HCI research to wearable activity recognition; the result is both a survey and a critical reflection on these concepts. As a further goal of our approach, we argue for the relevance and benefits this typology can have. Besides practical consequences, a typology of the human-computer relation and the discussion of the key term "activity" can be a medium for exchange with other disciplines. Especially as applications become more serious, for example in health care, a typology supporting a wider mutual understanding can be useful for cooperation with non-technical practitioners, e.g. doctors or psychologists.

RFID-based compound identification in wet laboratories with google glass
P. Scholl, Tobias Schultes, Kristof Van Laerhoven
DOI: 10.1145/2790044.2790055
Experimentation in wet laboratories requires tracking and identification of small containers like test tubes, flasks, and bottles. The current practice involves colored adhesive markers, waterproof hand-writing, QR- and barcodes, or RFID tags. These markers are often not self-descriptive and require a lookup table on paper or some digitally stored counterpart. Furthermore, they are subject to harsh environmental conditions (e.g. samples are kept in a freezer) and can be hard to share with other lab workers for lack of a consistent annotation system. Increasing their durability, as well as providing a central tracking system for these containers, is therefore of great interest. In this paper we present a system for the implicit tracking of RFID-augmented containers with a wrist-worn reader unit, and a voice-interaction scheme based on a head-mounted display.

eRing: multiple finger gesture recognition with one ring using an electric field
Mathias Wilhelm, Daniel G. Krakowczyk, Frank Trollmann, S. Albayrak
DOI: 10.1145/2790044.2790047
Since gestures are one of the natural interaction modalities between humans, they also represent a promising interaction modality for human-computer interaction. Finger rings provide an unobtrusive way to detect hand and finger gestures, provided they can detect a large variety of gestures involving the hand and multiple fingers. One method that could be used to detect such gestures with a single ring is electric field sensing. In this paper we present an early prototype, called eRing, which uses this method, and we evaluate its capability to detect different finger and hand gestures in a user study.

Exploring vibrotactile feedback on the body and foot for the purpose of pedestrian navigation
A. Meier, Denys J. C. Matthies, B. Urban, Reto Wettach
DOI: 10.1145/2790044.2790051
In this paper, we present an evaluation of vibrotactile on-body feedback for the purpose of pedestrian navigation. For this specific task, many researchers have already proposed approaches such as vibrating belts, wristbands, or shoes. Still, open issues remain, such as which body position is most suitable, what kinds of vibration patterns are easy to interpret, and how applicable vibrotactile feedback systems are in real scenarios. To find answers, we reconstructed prototypes commonly found in the literature and went on to evaluate several foot-related designs. On the one hand, we learned that vibrotactile feedback at the foot reduces visual attention and thus also potentially reduces stress. On the other hand, we found that urban space can be very diverse and ambiguous, and therefore a vibrotactile system cannot completely replace common path-finding systems for pedestrians. Rather, we envision such a system being applied as a complementary assistive technology.

Plant@Hand: from activity recognition to situation-based annotation management at mobile assembly workplaces
Rebekka Alm, Mario Aehnelt, B. Urban
DOI: 10.1145/2790044.2790057
This paper describes an approach to situation-based annotation management on the basis of work-integrated activity recognition and situation detection. We motivate situation-based annotations as a means of collecting and processing contextual knowledge about the work domain in order to improve the quality of information assistance at mobile assembly workplaces. Especially when we rely on automated processes that aim to detect the worker's ongoing activities and situations, we must at the same time deal with errors and wrongly inferred assumptions about reality. Here we see the strength of annotation management: it can be used to revise the contextual background knowledge required for determining autonomous behavior whenever errors and deviations between inferred and real situations occur.